Writing about online piracy and Google is a bit like living in the movie Groundhog Day. Day after day reality repeats itself. However, unlike Groundhog Day, there are no minor changes in the timeline that lead to a different outcome. When it comes to Google Drive and piracy, the story remains the same, day after day, year after year.
I’ve written about Google Drive several times over recent years, highlighting its ongoing role in giving online pirates convenient (and free) storage that makes it easy for them to “share” pirated movies. Aside from providing a safe haven for pirated content, Google also blatantly defies a key DMCA “safe harbor” requirement: that infringing material be removed “expeditiously.” In reality, it can take weeks for reported links to be removed from Google Drive.
(1) In general.—A service provider shall not be liable for monetary relief, or, except as provided in subsection (j), for injunctive or other equitable relief, for infringement of copyright by reason of the storage at the direction of a user of material that resides on a system or network controlled or operated by or for the service provider, if the service provider— (A) (i) does not have actual knowledge that the material or an activity using the material on the system or network is infringing; (ii) in the absence of such actual knowledge, is not aware of facts or circumstances from which infringing activity is apparent; or (iii) upon obtaining such knowledge or awareness, acts expeditiously to remove, or disable access to, the material;
(i) Conditions for Eligibility.— (1) Accommodation of technology.—The limitations on liability established by this section shall apply to a service provider only if the service provider— (A) has adopted and reasonably implemented, and informs subscribers and account holders of the service provider’s system or network of, a policy that provides for the termination in appropriate circumstances of subscribers and account holders of the service provider’s system or network who are repeat infringers; …
Here are some images from the very same Google Drive account, reported multiple times in October of 2017 for several dozen infringing titles, and again in 2018 and 2019. Despite being reported for dozens of pirated movies (many of the same titles) over several years, the account remained online.
Graphic from 2017 | Graphic from 2019
Despite dozens of DMCA notices over the years, the account remains untouched
Now, in 2020, the same Google Drive account (folder beginning with OB4Q) remains active and has re-uploaded infringing links (this time as torrent downloads) to the very same films previously removed.
Not to beat a dead horse, but why is Google allowed to sidestep the DMCA safe harbor requirements? Like other big tech companies, Google seems to routinely operate above the law. I guess having a well-paid gang of lobbyists in Washington has its benefits eh?
For easy reference, here are some of my past posts about this notorious Google Drive account from 2017 through 2019:
Of course, Google Drive piracy doesn’t damage only indie filmmakers like me. Even the big guys seem ripe for the picking. Unlike Google’s YouTube, Google Drive doesn’t provide any Content ID technology that would allow content creators (some) means to protect their work from this rampant theft. Here’s a Google Drive account I came across recently featuring dozens of Disney films. So much for needing to pay to subscribe to Disney+ eh?
YouTube is a money machine for Google. While actual numbers are hard to come by, it’s estimated that the online video hub brings in upwards of $15 billion annually. With that much money at stake, it’s not surprising that its business model continues to put profits over people.
YouTube content is often offensive, violent and awful
In 2015, following the on-air murder of a television reporter and her cameraman, gruesome videos of the event were quickly posted on YouTube with ads alongside. I wrote a post about it at the time:
Ads for Chili’s restaurant and Aria Resort appear on a clip of the WDBJ murder.
It’s no secret that YouTube slaps advertising on pretty much anything without regard for subject matter or ownership, but making money off of last week’s on-air murder of WDBJ-TV reporter Alison Parker and her cameraman Adam Ward is a new low. A source tipped me off to the fact that a number of opportunistic (and shameless) YouTube “partners” have uploaded and monetized clips of both the station’s live broadcast and the video taken, (and uploaded to Twitter) by the deranged murderer as he executed the two journalists during a televised live-shot for the morning news.
Four years later, it’s as if nothing has changed.
This week Andy Parker, the murdered reporter’s father, wrote an emotional piece for the Washington Post describing how YouTube/Google’s business practices continue to damage his family to this day.
After establishing a foundation to support arts programs for underserved children in Virginia and advocating for gun safety to prevent events like that which took his daughter’s life, his family has become the target of conspiracy theorists.
They have taken the gruesome footage of my daughter’s murder, edited it into videos selling these lies and flooded YouTube with hate-filled diatribes maligning my family.
The vitriol directed at me and my family has been unbearable. So I was outraged to discover that recommendation algorithms for YouTube and its parent company, Google, have bolstered these conspiracy theories.
Parker puts the blame for this clearly at Google’s door: “As much as I want to blame the sick creators for the pain I feel, I blame Google even more. By surfacing this content and profiting from the data Google collects from those who view it, Google is monetizing Alison’s death and our family’s pain.”
Of course, the Parker family’s experience is only one example in a long list of bad behavior by YouTube. In the past, the company has monetized everything from terrorist training videos to scenes promoting the sexual exploitation of children.
Algorithms make kiddie porn easy to access and make money for YouTube
Just this past month, YouTube’s algorithms came under more direct fire for “facilitating the sexual exploitation of minors” after YouTuber Matt Watson posted a video demonstrating how YouTube’s suggested videos took users to a series of videos showing children in various states of undress, where comments containing links to child pornography could be found. Watson told Ars Technica:
“YouTube’s recommended algorithm is facilitating pedophiles’ ability to connect with each other, trade contact info, and link to actual CP [child pornography] in the comments,” Watson reported. “I can consistently get access to it from vanilla, never-before-used YouTube accounts via innocuous videos in less than ten minutes, in sometimes less than five clicks.”
Wired also ran an extensive report on YouTube’s ongoing porn video monetization problem and how pedophiles use its comment section as a virtual bulletin board.
Videos of children showing their exposed buttocks, underwear and genitals are racking up millions of views on YouTube – with the site displaying advertising from major cosmetics and car brands alongside the content.
Comments beneath scores of videos appear to show paedophiles sharing timestamps for parts of the videos where exposed genitals can be seen, or when a child does the splits or lifts up their top to show their nipples. Some of the children in the videos, most of whom are girls, appear to be as young as five. Many of the videos have hundreds of thousands, if not millions of views, with hundreds of comments.
Of course, when confronted with such evidence Google belatedly responds, but only in the short term. For a company with such massive reach and resources, why can’t safeguards be put in place to prevent such behavior? They certainly have the means. What the company lacks is the will. Clearly, for Google/YouTube, it pays to look the other way.
YouTube prefers money to the moral high road
We’ve seen this in YouTube’s approach to online piracy of films and music for years, and despite repeated calls for change, these insidious tendrils of exploitation for the sake of profit have only continued to grow, not recede. Will Washington ever wake up? Well, maybe.
Today Senator Elizabeth Warren announced that if she’s elected president she will break up big tech companies like Google and Facebook. In a Medium post published today describing her plan, she characterized the issue this way:
As these companies have grown larger and more powerful, they have used their resources and control over the way we use the Internet to squash small businesses and innovation, and substitute their own financial interests for the broader interests of the American people. To restore the balance of power in our democracy, to promote competition, and to ensure that the next generation of technology innovation is as vibrant as the last, it’s time to break up our biggest tech companies.
Warren’s proposal is an important beginning. It’s long past time for the U.S. government to take action against a tech industry that has managed to avoid any semblance of regulation by repeating the tired mantra that reasonable regulation would “stifle innovation.” In fact, regulating the industry would do the opposite. It would create a fair and sustainable marketplace not dependent on the exploitation of others for its success.
Innovation takes many forms. The version of “innovation” created and currently promoted by the likes of YouTube is not necessarily one a civilized society should aspire to.
When people talk about effective ways to mitigate the impact of online piracy, YouTube’s Content ID is often used as an example of what works. Unfortunately, despite its role as poster boy for anti-piracy tech, in reality it falls flat as a gatekeeper against online piracy.
Aside from a labyrinth-like user interface that seems designed not to help rights holders but to discourage them from using Content ID, the actual fingerprinting technology behind it can be easily fooled.
YouTube introduced the Content ID system in 2007. At the time, the company was facing pressure from a Viacom lawsuit, among others. According to YouTube, it’s pretty straightforward:
Videos uploaded to YouTube are scanned against a database of files that have been submitted to us by content owners. Copyright owners get to decide what happens when content in a video on YouTube matches a work they own. When this happens, the video gets a Content ID claim.
Looking to make money off work they don’t own, clever YouTube users have discovered ways to fool the technology so their illegal uploads of copyrighted movies and music don’t get flagged, blocked or removed.
I began noticing this phenomenon lately as I found full, infringing copies of films uploaded that matched content owned by a film distributor I work for. This seems to be happening more often, and I was curious as to how these pirated copies had avoided detection by Content ID. When I looked closely I saw that subtle manipulations in brightness had taken place, along with slight adjustments to frame size and sometimes the crop of the frame.
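To see why manipulations like these can defeat fingerprinting, it helps to know that many perceptual hashes reduce a frame to a compact bit pattern based on brightness. The sketch below is a deliberately simplified toy (an “average hash” over a tiny grid of brightness values), not YouTube’s actual Content ID algorithm, but it illustrates how mirroring a frame scrambles the resulting fingerprint:

```python
# Toy illustration (NOT YouTube's actual algorithm): a simple "average hash"
# perceptual fingerprint, and how mirroring a frame changes every bit of it.

def average_hash(frame):
    """frame: 2D list of brightness values (0-255). Returns a bit string:
    '1' where a pixel is brighter than the frame's mean, else '0'."""
    pixels = [p for row in frame for p in row]
    mean = sum(pixels) / len(pixels)
    return "".join("1" if p > mean else "0" for p in pixels)

def hamming(a, b):
    """Number of differing bits between two equal-length hashes."""
    return sum(x != y for x, y in zip(a, b))

# A toy 4x4 "frame" with a bright region on the left side.
frame = [
    [200, 200, 10, 10],
    [200, 200, 10, 10],
    [200, 200, 10, 10],
    [200, 200, 10, 10],
]

# Horizontally mirroring the frame, as with the reversed title uploads.
mirrored = [list(reversed(row)) for row in frame]

h1 = average_hash(frame)
h2 = average_hash(mirrored)
print(hamming(h1, h2))  # all 16 bits swap bright/dark positions: prints 16
```

A naive matcher comparing these two hashes would see no resemblance at all, which is roughly what lets a flipped or re-cropped upload slip past a fingerprint database.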
When I started poking around YouTube for other examples of these uploads, they were easy to find. It only took me a few minutes to find dozens of full copies of a variety of copyrighted movies, old and new. One title I came across was the movie Everest. Below are screen captures from two different full uploads of the movie I found streaming on YouTube.
Two full copies of the movie Everest uploaded to YouTube.
In this case the uploader had used several techniques to avoid detection including reversing the frame (note the backwards title), darkening the lower part of the frame and cropping it. Of course, having recently viewed the film on HBO, watching a lousy copy like this on YouTube wouldn’t be my choice, but apparently others didn’t mind. Uploaded only a month ago, the movie had already racked up more than 16,000 views.
Pirate uploads make money for uploader and for YouTube
Why go to all this trouble to manipulate a movie for upload to YouTube? Well, it’s the age-old pirate motivator–money. This uploader, who goes by the name Kenneth Lamb, has claimed ownership of this content and monetized it with ads. He makes money. YouTube makes money. The movie’s actual production companies make nothing.
This YouTube user claims to own rights to Everest movie worldwide and makes money off ads
In an ironic twist, several of the ads that appeared when I was examining (and reloading) this pirated copy of the film were for movies, including DreamWorks’ upcoming Trolls and Warner Bros.’ Jason Bourne. It’s more than a tad ironic that Hollywood studios are (inadvertently) putting cash in YouTube’s hands via advertising on a pirated copy of one of their own productions.
Ultimate irony that ads for upcoming movie releases are featured on pirated copies of Hollywood films
I don’t deal with music or audio files on YouTube but there are similar manipulations happening there as well where uploaders resample, add noise, etc. to fool the Content ID system into ignoring the file.
What can YouTube do to fix this growing problem? Per usual, the list is long and varied, but it begins with asking Google engineers to design better fingerprinting tech. Other companies that offer digital fingerprinting technology seem to do a better job of catching these circumventions. If I can easily tell that an upload is a copy of the movie Everest, why can’t Content ID? You can’t tell me that with all its financial (and technological) resources YouTube doesn’t have the means to upgrade its system.
Technological solutions exist. It’s just a matter of priorities. Stopping piracy isn’t a priority for YouTube.
Aside from updating its fingerprinting capabilities, YouTube could also improve the Content ID system by providing a better interface, more transparency, better compensation for artists, etc. Of course, that would again mean lower profits for Google/YouTube, so such straightforward fixes are unlikely. Meanwhile, YouTube makes great hay out of its concern for poor, maligned users who may have received an erroneous DMCA notice. The company is willing to spend money to defend a few select uploaders but won’t spend resources to fix its broken Content ID system?
Operating only a marginal (not great) Content ID system is in YouTube’s best interests
Of course the powers that be at YouTube probably prefer to keep Content ID just the way it is–creaking along, occupying a neutral zone positioned between accolades and scorn. It’s a safe position, one that gives YouTube officials cover when they use disingenuous excuses about their anti-piracy practices to critics, while avoiding any real (legal or financial) consequences.
Content ID does the job just well enough…but that doesn’t mean it does a good job. It could serve as a true model for technological safeguards against piracy, but as it stands now, it’s merely a slight bump in the road for those determined to steal and monetize the works of others. Meanwhile, YouTube continues to pocket advertising cash and make its stockholders happy while leaving filmmakers and musicians on the outside, looking in.
Time for YouTube to get serious about cleaning up all the junk, spam and malware files on its site
YouTube is great for finding videos about pretty much everything. Need to learn how to fix a furnace or use the latest camera equipment? There’s bound to be a video that shows you how. Unfortunately, amid the useful stuff, YouTube is also chock full of garbage. The question is, with its massive technical resources, why doesn’t the site do a better job keeping house?
I’ve written before about the epidemic of fake “full-movie” uploads that fill YouTube. That was in 2012. Now, six years later, the problem still exists. Apparently, YouTube isn’t concerned that its pages are full of spam files, many of them fake pirate movie uploads that lead users to sites rife with malware and money-making scams.
These fake uploads, promising full copies of hundreds of films, both indie and mainstream, are easy to find. Go to YouTube, search for a specific film title using the term “full movie,” and voilà, most results will lead to garbage. These bogus uploads fall into two categories. Some offer links to other dubious websites while others are merely dummy files uploaded to generate advertising income (for the user and YouTube). Some do both.
As for these offsite scam movie/gaming portals, it’s difficult to figure out who is actually behind them. The site URLs vary and include tzarmedia.com, gnomicfun.com, cnidaplay.com, jabirufun.com, flogame.com, among dozens of others. Curiously, a WHOIS search for these various domains indicates they are all registered via the same domain registrar, enom.com. One can’t help but suspect that this particular business model is being orchestrated by a few linked operators. When I called their customer service number to ask questions I was given the proverbial run-around. Other contact information was essentially non-existent.
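The registrar check described above is easy to reproduce. A real lookup would query WHOIS over the network (for example, with the `whois` command-line tool: `whois tzarmedia.com`); the sketch below instead parses sample WHOIS-style text and groups domains by their “Registrar:” field. The sample records are illustrative stand-ins I wrote for this example, not actual WHOIS output:

```python
# Sketch: group domains by the Registrar field of their WHOIS records.
# The records below are illustrative placeholders, not real WHOIS responses;
# a live check would fetch each record with the `whois` CLI or a WHOIS library.

from collections import defaultdict

sample_records = {
    "tzarmedia.com": "Domain Name: TZARMEDIA.COM\nRegistrar: eNom, LLC\n",
    "gnomicfun.com": "Domain Name: GNOMICFUN.COM\nRegistrar: eNom, LLC\n",
    "cnidaplay.com": "Domain Name: CNIDAPLAY.COM\nRegistrar: eNom, LLC\n",
}

def registrar_of(whois_text):
    """Pull the 'Registrar:' field out of a WHOIS-style response."""
    for line in whois_text.splitlines():
        if line.lower().startswith("registrar:"):
            return line.split(":", 1)[1].strip()
    return None

by_registrar = defaultdict(list)
for domain, text in sample_records.items():
    by_registrar[registrar_of(text)].append(domain)

for registrar, domains in sorted(by_registrar.items()):
    print(registrar, "->", sorted(domains))
```

A shared registrar is only circumstantial, of course, but when dozens of lookalike portals all resolve to the same one, it supports the suspicion that a few linked operators are behind them.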
These scam movies sites share domain registrar and likely more
Why are such dummy files an issue? Not only do they pollute legit searches for content on YouTube, but they make the process of reviewing pirated content more difficult for rights holders. When I search for copies of my film using my Content ID account, I have to wade through dozens of these fake uploads.
Removing them is an incredibly time-consuming task as it seems YouTube has purposely chosen to make the Content ID dashboard as inconvenient as possible for users.
When I get a page of results that is nothing but dummy uploads, why doesn’t YouTube offer a select-all option so that I can remove them en masse? Instead, if I want to remove them, I’m forced to click open each one and go through a five-step process: select takedown, select title, acknowledge, fill in my signature and then click takedown.
Another interesting twist is that many of these fake movie uploads also share links to legit social media sites like MTV’s Facebook or The Wrap’s Twitter account. I checked to see whether the operators of these sites knew about this and was assured they didn’t. It would appear these dummy uploads include such links to drive more traffic to the bogus uploads and make them seem legit.
What can YouTube do to prevent this scheme? Why not utilize its own fingerprinting tech (Content ID) to detect and block these dummy files? If necessary, why not employ a team of actual humans to help with the task? I imagine if their engineers put their minds to it the task would be a relatively simple one. Certainly YouTube can afford to invest in keeping its house tidy.
Why do advertisers allow themselves to be part of this junkyard?
Perhaps YouTube doesn’t take action against such uploads because it makes money off them? Here are just a few examples I found recently–bogus uploads with advertisements for New York Life, Walmart, Tide and Walgreens. These fake pirate full-movie uploads emblazoned with ads are a dime a dozen on YouTube. Do these advertisers know what they’re paying for? Do they care? Perhaps TAG, the Trustworthy Accountability Group, should take a look at this situation and pressure YouTube to take action.
Can you think of any other business that could get away with charging money for this type of thing? Isn’t Internet commerce–and YouTube–mature enough at this point to operate a business where what you see is what you get? Apparently not… Imagine walking down the aisles of Target and finding half the merchandise to be knock-offs or empty boxes.
It’s not only the spam fake movie uploads and advertising scams that are problematic. As a study by the Digital Citizens Alliance found, YouTube is also rife with uploads that link to various types of malware, including RATs (Remote Access Trojans), which hackers use to hijack the computers of unsuspecting internet users. Why is it OK for YouTube to continue to allow activity that scams–and possibly endangers–users?
As I mentioned, YouTube has the technical expertise and financial means to develop better algorithms and Content ID matching to weed out these garbage uploads if it chose to do so. Until then, the site will increasingly resemble a hoarder’s home with junk stuffed into every conceivable corner. Is that any way to run a business?
Why not make Content ID more accessible and transparent?
Much has been written about YouTube’s Content ID program, a fingerprinting technology that allows rights holders to find and claim their music or movies when uploaded to YouTube. The technology was introduced in 2008 in the wake of Viacom’s lawsuit against YouTube and since then has helped (some) creators mitigate the problem of piracy on the popular UGC (user-generated content) site.
Those who have access to the Content ID system can upload reference files and use a dashboard to choose how matches should be handled. Matches can be limited based on audio, video, and length. Matching content can then be blocked, removed, or monetized based on territorial rights.
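The match-handling choices described above amount to a simple policy engine: a rights holder sets per-territory rules, and each match is blocked, monetized, or merely tracked accordingly. Here is a minimal sketch of that kind of logic. The field names, actions, and minimum-match threshold are my illustrative assumptions, not YouTube’s actual Content ID API:

```python
# Minimal sketch of per-territory match handling, as a rights holder might
# configure it. Names and thresholds are illustrative assumptions only.

from dataclasses import dataclass

@dataclass
class Match:
    video_id: str
    territory: str        # viewer's country code
    matched_seconds: int  # length of the detected match

# Example policy: monetize matches in the US, block them in Sweden,
# and just track viewing stats everywhere else.
POLICY = {"US": "monetize", "SE": "block"}
MIN_MATCH_SECONDS = 30    # ignore incidental short matches

def handle_match(match: Match) -> str:
    """Return the action to take for a detected match."""
    if match.matched_seconds < MIN_MATCH_SECONDS:
        return "no_action"
    return POLICY.get(match.territory, "track")

print(handle_match(Match("abc123", "SE", 147 * 60)))  # prints: block
print(handle_match(Match("abc123", "US", 120)))       # prints: monetize
print(handle_match(Match("abc123", "DE", 120)))       # prints: track
```

The length check is why a full-movie upload can be blocked outright while a few seconds of incidental footage is left alone, and the territory lookup is why the same upload can be monetized in one country and unavailable in another.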
I’ve written many pieces about what works and what doesn’t when it comes to Content ID so I won’t be redundant here, but this week I read some pieces which highlight some lingering issues that continue to limit the reach (and effectiveness) of this technology–most notably limited access and accountability.
Are audiobooks being ignored?
Author Ryan Holiday published a piece in The Observer asking, “When Will YouTube Deal With Its Audiobook and Podcast Piracy Problem?” He described how using search, he’d found the audio version of one of his books streaming, in full, on YouTube. The audio had been streamed 16,000 times. He observed, “It might not seem like a ton but the book had sold about 50,000 copies in audio—an additional 30% of that figure pirated it through a single video?”
Holiday goes on to repeat the oft-heard lament of filmmakers, musicians, et al who have found their pirated works streaming on YouTube. Like many of them, Holiday believes YouTube needs to make it easier for artists to protect their creations from this type of theft:
For its part, YouTube needs to get its act together and offer tools directly to publishers and authors. Audiobook piracy is real and clearly growing. The idea that songs and television and films all deserve protections from Content ID but authors don’t is absurd.
Now, to be fair, it’s not clear that Holiday himself has ever applied for Content ID access. It seems that YouTube’s language dissuaded him.
I can continue to file these claims as the author but since I’m not a major publisher with a “substantial body of original material,” I can’t participate in YouTube’s Content ID personally.
If I were an author (or publisher) I would not hesitate to at least try to apply for a Content ID account. I had no difficulty being approved for a Content ID account in 2010 and only own the rights to two films, a feature and a documentary that I co-produced. The companies that distribute my film also have Content ID for their film catalogues.
YouTube describes its approval criteria this way: “…an evaluation of each applicant’s actual need for the tools. Applicants must be able to provide evidence of the copyrighted content for which they control exclusive rights…Content ID applicants may be rejected if other tools better suit their needs.”
The key here is the last sentence. There’s also Content Verification which seems to be another, higher tier of Content ID aimed at large companies. The other “tool” is simply sending a takedown via web form. That may be appropriate in very limited situations, but using it can be time-consuming and also requires a rights holder be proactive in searching for infringing content. Since Content ID does that work for you it’s the key reason it should be made available on a much wider scale. As Holiday noted, one copy of his book was accessed thousands of times….shouldn’t the onus be on YouTube to put a roadblock up to prevent infringement? Why should creators be required to be copyright cops too?
Lack of transparency has long been a flash point for musicians vs. YouTube
Frustration over YouTube’s Content ID and monetization scheme was also at the core of a blog post by five-time Grammy winner, musician Maria Schneider, published on Music Tech Policy. Schneider has become a strong advocate for artists’ rights and attacks Content ID on a number of levels, skewering YouTube over its lack of transparency (a common complaint among musicians) and for its apparent refusal to grant access to artists–even Grammy recipients like her.
Content ID is reserved for big record companies with big catalogues, and probably selected independent artists whom YouTube believes will make YouTube a heap of money…In the press, YouTube has fought back against the recent flood of criticism, saying that all rights-holders can access Content ID – that they can get it through “third-party vendors.” These third party vendors often take between 20% to 50% of the revenue paid by YouTube—after YouTube takes its share.
I highly recommend reading Schneider’s post for her account of how Content ID has failed her and other musicians. As a filmmaker, in terms of using Content ID simply to combat piracy, my experiences have differed, but I do share many of her overall concerns. Unfortunately, the fact she (and others) are apparently being denied direct access to Content ID tools is an ongoing issue that inflames the lingering, legitimate mistrust creators have with Google.
Does it really need to be this way?
In spite of years of ongoing complaints like these, neither Google nor YouTube seems to have made a genuine effort to work with artists–instead preferring to stonewall or sidestep debate. When YouTube officials do comment, they often find themselves tangled up in webs of their own making, as was the case when musician Zoe Keating took them to task in a very public exchange. Why not work with creators to solve some of these problems instead of demonizing them? There are ways to find common ground if only the powers that be at Google would care to (really) listen.
Content ID could make it easier for creators of all stripes to ask permission instead of simply taking content
Moving the Content ID machinery out of the shadows could pay dividends in other ways–perhaps by helping bridge the divide in disputes over copyright. Maarten Zeinstra, an “advisor on copyright and technology,” recently penned a thoughtful blog post proposing that YouTube’s Content ID could become a useful tool for those who are interested in utilizing content in legitimate ways. His piece, “YouTube should open up Content ID,” was published this past May on the Dutch website Kennisland. In it, Zeinstra noted:
YouTube should open up Content ID and make their register of rights holders publicly available...Let anyone be able to contact the rights owners of a certain clip or publicly contest that ownership. This creates an innovative new possibility in using content with permission or under copyright exceptions, and create legal certainty for copyright holders and remixers alike.
Now, I’m a tad skeptical as to what he means by “contest that ownership” but I’m open to the possibility that he’s merely describing a middle ground. If someone wants to make a mashup video using clips from my film or segments of archival footage from my documentary they could use Content ID to find that I am the rightful owner of the footage and can ask permission. Personally, I support the idea of fair use but also appreciate the fact that one should ask permission. I did so with footage used in my documentary and was always met with a positive response. Perhaps opening up Content ID in this way could foster a new respect for the work of creators and support the idea of asking, and not simply taking.
Improving Content ID would be in everyone’s best interests
Clearly, Google needs to do a much better job of providing access and accountability with its Content ID and monetization programs. Expand outreach to indie artists. Include them in discussions about how to improve Content ID. Update the interface to make it more intuitive and user-friendly. Open the books so that creators can see exactly how much revenue is earned and where it goes. Be innovative and use Content ID to open new avenues to legitimate use of copyrighted content.
There’s little doubt that YouTube’s Content ID is a powerful tool that’s in dire need of an overhaul, both in terms of who uses it and how it’s used. This technology could provide so much more than it now does–but the ball is in Google’s court. I won’t hold my breath waiting for something to change–but there’s always hope isn’t there?
YouTube users claim Fair Use as a defense for uploading full copies of pirated movies
There was a lot of talk about fair use and takedown abuse at last week’s U.S. Copyright Office Section 512 roundtables in San Francisco. Many of those who spoke bemoaned how poor, innocent uploaders were victimized, time after time, by malicious DMCA takedowns.
It’s a tried and true talking point–convenient, but disingenuous all the same. Some of us, myself included, tried to make the point that creators, whose work is routinely (and massively) stolen, are often (doubly) victimized by malicious fair use claims.
I thought I’d share an example of this that occurred just this week on YouTube. On Tuesday a full copy of the Swedish indie film “Kyss Mig” (all 147 minutes of it) was uploaded to YouTube by a user aptly named “Free Movies.” As an added flourish, the user name included the notation, “free movies bitches.”
In this instance YouTube’s Content ID system worked as intended. The Content ID user (an indie film distributor) had set the system to block uploads of a certain length in its territories. Even though the video was a full, pirated copy of the film, it wasn’t taken down, it was simply blocked. So far, so good right?
Wrong…This YouTube user didn’t seem to think the rights holder had the right to block the full, infringing copy and promptly disputed the block. S/he stated the reason as being:
Approval from copyright Holder is not required. It is fair use under copyright Law.
The user also added a note: “I don’t need to explain.” Clearly Free Movies didn’t bother to read YouTube’s information on disputing a claim or its explainer on fair use.
Despite all the testimony at last week’s roundtable about fair use–and how copyright holders supposedly seek to punish those who claim it with malicious takedowns–it’s worth pointing out, yet again, that for every legit “fair use” claim, there are also false, and rather malicious, abuses of that defense. It’s a fact conveniently overlooked by the anti-copyright apologists.
Bogus “fair use” claim on YouTube
Take a gander below at the actual screen caps documenting this bogus “fair use” claim. Hopefully, officials considering DMCA reforms will acknowledge that creators can be twice victimized by abusive fair use claims.
I did in fact “reinstate” the claim (on behalf of the indie distributor I work for) so we’ll have to wait and see if this user goes on to file a counter-notice. If s/he does so, the film, in its entirety, will return to YouTube even though it’s CLEARLY infringing because we don’t have the financial resources to enforce the removal in federal court.
I’ve had the same thing happen after full pirated copies of our film were uploaded to YouTube. For creators trying to protect their work it’s a lose-lose… Perhaps YouTube should require its users to review “fair use” and “copyright” before they are allowed to upload content of a certain length? Why should creators be twice victimized while uploaders walk away unscathed?
As an indie film and broadcast journalism veteran, I'll share my perspectives on issues of interest to the creative community and beyond--Ellen Seidler