Slowly but surely, Alexa’s becoming a more competent catchall video assistant. Back in January, Amazon launched its Video Skill API designed to offer more control over apps from cable and satellite companies. An update this week brings the ever-important ability to use the smart assistant to start recording.
The skill joins a number of functions already available from top providers, including Dish, TiVo, DIRECTV and Verizon, each of which will likely update its Alexa skill set to reflect the new feature. The whole thing works pretty much as you’d expect.
Say, “Alexa, record the A’s game,” and the associated service will do just that. Or, you know, any baseball team, really.
Also new in this update is the ability to jump directly into frequently used navigation options, like DVR interfaces or video services such as Netflix or Prime (the example Amazon gives in its post on the topic). Once in a specific program, users can ask Alexa to do things like pause the show, and the assistant will comply.
The new skills are available now to developers and should be hitting some of the aforementioned services soon.
WhatsApp has added a much-requested new feature after it began to allow users to make group voice and video calls.
It’s been just over three years since the company, which is owned by Facebook, introduced voice calls, with a video option following a year later. Today, WhatsApp counts over 1.5 billion monthly users, who it says make over two billion minutes of calls via its service each day.
Starting this week, callers can now add friends by hitting the “add participant” button which appears in the top right corner of their screen. The maximum number of participants is four and, impressively, WhatsApp said the calls are end-to-end encrypted.
That’s not an easy thing to do. Telegram, a self-professed secure messaging app, hasn’t even gotten around to encrypting its group messaging chats, let alone group calls.
On the encryption side, WhatsApp has long worked with WhisperSystems to shield all messages and calls on its platform from prying eyes and ears. That said, the relationship between the two became a little more complicated this year when WhatsApp co-founder Brian Acton donated $50 million of his wealth, accumulated from Facebook’s acquisition of his company in 2014, to the Signal Foundation, which is associated with WhisperSystems.
Acton quit Facebook last year — this year he encouraged people to delete the social network for its data and privacy screw-ups — while his fellow WhatsApp co-founder Jan Koum joined him in departing in May of this year.
Like Acton, Koum was apparently irked by scandals such as Cambridge Analytica, although his on record explanation for quitting was to “do things I enjoy outside of technology, such as collecting rare air-cooled Porsches, working on my cars and playing ultimate frisbee.” Each to their own…
Rachel Moore, the former director of Product at computer vision startup Pilot AI, is suing its co-founders, CTO Robert Elliot English and CEO Jonathan Su, for sexual harassment, discrimination, retaliation and wrongful discharge. Pilot AI’s human resources provider TriNet and its Series A investor NEA are also named in the suit, filed in San Francisco Superior Court today. The plaintiff, a 24-year-old master’s graduate of Stanford University, is seeking a trial by jury.
The suit alleges that Su and English created a hostile work environment colored by sexually inappropriate comments, including discussions of pornography, English’s sexual exploits and that he “participated in an anal sex workshop at Burning Man led by a famous porn star.” Only when Moore agreed to participate in the crude comments was she awarded more status in the company and a $20,000 raise.
English allegedly later invited Moore into his office, closed the door, dropped his pants while talking about his ex-girlfriend and initially refused to let Moore leave. For rejecting his advance, he then began to retaliate against her in the workplace, according to the suit. It states that Moore reported the incident to Su who dismissed the allegations as English being “sexually frustrated.” Su is said to have encouraged Moore to ask English out on a dinner date to resolve the issue, which she eventually declined out of fear for her safety.
Su later urged Moore not to file a formal report about the incident in English’s office because it could “end the company,” according to the suit. She filed one anyway, prompting an investigation by law firm WilmerHale. Moore agreed to participate only if the law firm remained neutral and did not act as counsel for Pilot AI. NEA’s Rick Yang, who sits on the board, is said to have overseen the investigation.
The suit calls the investigation “an utter sham and a cover up.” NEA allegedly refused to disclose the investigation report, claiming attorney-client privilege. Yang is said to have eventually disclosed a summary of the report that confirmed the pants-dropping incident but found there was nothing sexual about it, and there were no repercussions for the founders. After the investigation, Moore allegedly requested a leave of absence rather than return to the office, where she would have to report to the defendants. When additional leave requests were ignored, she inquired about her employment status; she allegedly then ceased to be paid or to have access to company systems, and concluded she had been terminated.
The charges filed include quid pro quo sexual harassment, hostile working environment harassment, discrimination based on gender, retaliation, failure to prevent or correct harassment, aiding and abetting harassment, wrongful discharge, intentional infliction of emotional distress, failure to pay wages, waiting-time penalties and violations of labor and business codes.
Requests for comment from English, Su, Pilot AI, NEA and TriNet were not returned before press time. Moore’s law firm Arena Hoffman LLP issued this statement to TechCrunch, from its attorney Ron Arena:
As alleged in the complaint, Ms. Moore contends that she was subjected to a sexually charged workplace, where Pilot AI’s founders discussed anal sex workshops, boasted of sexual conquests on Tinder, named a server ‘Deep Head,’ and let executives drop their pants in a meeting and call Ms. Moore’s footwear ‘fuck me boots.’ Ms. Moore alleges that her complaints were ignored, then swept aside in a sham investigation – after she declined the CEO’s direction to meet her pants-dropping supervisor alone on a dinner date.
We’ll have more info as it becomes available and will update with comments from the parties involved.
Uber is closing the doors on its on-demand package delivery service for merchants, RUSH, in New York City, San Francisco and Chicago, TechCrunch has learned. In an email to users, Uber said it plans to close RUSH operations June 30, 2018.
“At Uber, we believe in making big bold bets, and while ending UberRUSH comes with some sadness, we will continue our mission of building reliable technology that serves people and cities all over the world,” Uber’s NYC RUSH team wrote to customers.
Uber has since confirmed the wind-down.
“We’re winding down UberRUSH deliveries and ending services by the end of June,” an Uber spokesperson told TechCrunch. “We’re thankful for our partners and hope the next three months will allow them to make arrangements for their delivery needs. We’re already applying a lot of the lessons we learned together to our UberEats food delivery business in over 200 global markets across more than 100,000 restaurants.”
With UberRUSH, which I forgot still existed, people can request deliveries for items weighing no more than 30 pounds, excluding animals, alcohol, illegal items, stolen goods and dangerous items like guns and explosives. Last April, Uber stopped providing courier services to restaurants, encouraging them to instead use UberEATS, the company’s food delivery service. The shutdown of UberRUSH comes shortly after Shyp, an on-demand shipping company, announced its last day of operations.
In what might be the most ridiculous stunt ever pulled in the art world, a Banksy piece has, in a sense, self-destructed. Right in front of an audience of would-be buyers.
A framed canvas version of Banksy’s Girl with Balloon was set to be auctioned at Sotheby’s in London. As the auction came to a close with a final bid of £953,829 (a little over $1.25 million), the print’s frame began… beeping. Then, whirring. Seconds later, the canvas slid through the bottom of the frame, now almost entirely shredded.
The anonymous artist has long expressed a dislike of art galleries reselling their works, going so far as to create a piece featuring an audience of bidders battling over a print that reads, simply, “I can’t believe you morons actually buy this shit”. This seems to be Banksy’s latest way of expressing that discontent.
Of course, it’s easy to argue that the whole thing makes the piece even more desirable, because, well… art. If people with mountains of cash are buying art to have a ridiculously rare conversation piece that they hope others recognize, this one just rocketed up the list. It’s now that piece. Or technically those pieces, I guess.
Curiously, the canvas didn’t make it all the way through the shredder — did it jam, or was that intentional? With about a third of the print left in the frame, the shredded strips remain attached and dangling, preventing anyone from splitting the pile of shreds into 50 more auctions with everyone vying for a slice.
So how did it all work? Writer Zoe Smith shared a video on Twitter this morning that she notes appeared briefly on Banksy’s Instagram before being pulled down (Update: it’s now back up! See below). It shows what looks to be the inside of the frame (which, in hindsight, seems comically large), shredder and all:
Update — here’s the video, as reposted by Banksy:
“The urge to destroy is also a creative urge” – Picasso
A post shared by Banksy (@banksy) on Oct 6, 2018 at 10:09am PDT
In the same video, it’s claimed that this was all put in place “a few years ago”. It appears that the “shredder” is a series of X-Acto-style blades that the canvas was raked over.
Meanwhile, a news post on Artsy suggests that the shred could’ve been triggered by someone in the audience with “a device in his hand”.
But what about power? In a video of the piece being removed post-shreddage, there doesn’t seem to be any wires behind the frame, nor anything plugged in. The piece itself is detailed as having been given to its previous owner by Banksy in 2006. Both the speakers in the frame and the motors of the shredder would require a power source. Keeping a battery ready and waiting for 12 years seems… unlikely.
The Sotheby’s listing for the piece notes that it was “Authenticated by Pest Control”. Pest Control is Banksy’s “handling service”, which will go out to verify supposed Banksy pieces to try to make sure no one drops a pile of cash on a one-of-a-kind Borksy. Perhaps part of the verification process involved double checking everything within the frame.
Some folks on Twitter, meanwhile, theorize that the original print could still be hidden within the frame, with what emerged having been shredded and rolled up in the frame long ago. That video Banksy posted showing the blades within the frame makes that seem unlikely… but could it be pranks all the way down?
Banksy posted a too-perfect still of the shred in process, with the caption “Going, going, gone…”
Going, going, gone…
A post shared by Banksy (@banksy) on Oct 5, 2018 at 6:45pm PDT
(Top image left via Sotheby’s; Top image right via Banksy on Instagram)
After taking tens of thousands of crowd-funding pre-orders for a high-end pair of “3D sound” headphones, audio startup Ossic announced this weekend that it is shutting down the company and backers will not be receiving refunds.
The company raised $2.7 million on Kickstarter and $3.2 million on Indiegogo for its Ossic X headphones, which it pitched as high-end head-tracking headphones that would be perfect for listening to 3D audio, especially in a VR environment. While the company also raised a “substantial seed investment,” in a letter on the Ossic website it blamed the slow adoption of virtual reality, alongside crowdfunding campaign stretch goals that bogged down its R&D team.
“This was obviously not our desired outcome. The team worked exceptionally hard and created a production-ready product that is a technological and performance breakthrough. To fail at the 5 yard-line is a tragedy. We are extremely sorry that we cannot deliver your product and want you to know that the team has done everything possible including investing our own savings and working without salary to exhaust all possibilities.”
We have reached out to the company for additional details.
Through January 2017, the San Diego company had received more than 22,000 pre-orders for its Ossic X headphones. This past January, Ossic announced that it had shipped the first units to the 80 backers in its $999 developer tier. In that same update, the company said it would enter “mass production” by late spring 2018.
In the end, after tens of thousands of pre-orders, Ossic only built 250 pairs of headphones and only shipped a few dozen to Kickstarter backers.
Crowdfunding campaign failures for hardware products are rarely shocking, but often the collapse comes from the company not being able to acquire additional funding from outside investors. Here, Ossic appears to have been misguided from the start: even with nearly $6 million in crowdfunding, plus seed funding the company said came close to matching that figure, it was left unable to begin large-scale manufacturing. The company said in its letter that it would likely take more than $2 million in additional funding to deliver the existing backlog of pre-orders.
Backers are understandably quite upset about not receiving their headphones. A group of over 1,200 Facebook users has joined a recently created page threatening a class action lawsuit against the team.
Despite surpassing analyst expectations for the quarter, Roku disappointed Wall Street when it shared its fourth-quarter earnings after the bell on Wednesday. The digital streaming business fell about 18% in after-hours trading in the minutes following the news release. Roku posted $188.3 million in revenue, when Yahoo! Finance estimates showed $182.5 million.
On 29-30 November, thousands of early-stage startups across Europe and beyond will attend Disrupt Berlin 2018 and spend two program-packed days exhibiting and exploring the very latest in tech innovations. In a crowd that size, it helps to have a tool to find and connect with the right people.
That’s why we’re making our CrunchMatch platform available to all Disrupt Berlin attendees. Last year, our free business match-making service connected investors and founders to discuss potential funding opportunities based on similar goals and interests. Now CrunchMatch can help everyone network more efficiently.
We’re talking founders and investors looking to connect, developers in search of employment, founders hunting for collaborators or startups recruiting tech talent — the list goes on. CrunchMatch can save you valuable time and help you make valuable connections.
Luke Heron, CEO of TestCard, has first-hand experience with the power of CrunchMatch, which he used to secure meetings with multiple VCs at Disrupt Berlin 2017. Those connections, and the relationships he built, paid off.
In a recent email, Heron told us that TestCard “just closed $1.7 million in funding (which is thanks to you and your team, bless you!) You guys are fantastic — the lifeblood of the startup scene.”
And several founders who attended Disrupt San Francisco this past September used CrunchMatch and walked away from their meetings with term sheets.
Representing the investment point of view, here’s what Michael Kocan, managing partner at Trend Discovery, said about his CrunchMatch experience.
“It makes vetting deals extremely efficient. I scheduled more than 35 meetings with startups using CrunchMatch, and we made a significant investment in one, who came to our attention through Startup Battlefield.”
Ready to simplify your networking at Disrupt Berlin? Here’s what you need to know. When we open CrunchMatch, all registered attendees will receive an email explaining how to access the platform and fill out their profiles. Your profile spells out your role and the type of connections you want to make. CrunchMatch kicks into gear and makes suggested connections and then — subject to your approval — the platform handles all the scheduling details.
Disrupt Berlin 2018 takes place 29-30 November. Still need a ticket? Buy your pass right here. We can’t wait to see you in Berlin! And be sure to use the CrunchMatch advantage — it’s the most efficient way to find your people and fuel your dream.
Welcome back to CTRL+T, the TechCrunch podcast where Megan Rose Dickey and I talk about the stories we want to talk about and connect them to the culture in which we’re all trying to live.
We first tackled the flying taxi phenomenon that isn’t really a phenomenon anymore. It’s more like we’re all going to be ducking under the near-distant hum of electric vertical take-off and landing vehicles, or eVTOLs (really rolls off the tongue), sooner rather than later. You see, Uber already has deals with flying taxi manufacturers, electric vehicle battery and charger manufacturers and firms that want to build the “skyports” from which these things are going to have to take off and land. And the public learned all about it at Uber Elevate.
But before we talked about Uber, we spent some time discussing the inside of Megan’s mouth. Regular readers might recall a recent visit she made to Uniform Teeth to find out about the startup’s funding round. She tried out their 3D imaging tech and received some news she wasn’t quite prepared for. And recorded the audio.
It’s a rip-roaring episode, folks, so click play below to have a listen. Or better yet, subscribe on Apple Podcasts, Stitcher, Overcast, CastBox or whatever other podcast platform you can find.
On June 6, 2014, Kubernetes was released for the first time. At the time, nobody could have predicted that four years later the project would become a de facto standard for container orchestration, or that the biggest tech companies in the world would be backing it. That would come later.
If you think back to June 2014, containerization was just beginning to take off thanks to Docker, which was popularizing the concept with developers, but it was still so early that there was no standard way to manage those containers.
Google had been using containers as a way to deliver applications for years and ran a tool called Borg to handle orchestration. It’s called an orchestrator because much like a conductor of an orchestra, it decides when a container is launched and when it shuts down once it’s completed its job.
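That orchestrator description boils down to a reconciliation loop: compare the state you want against the state you actually have, then start or stop containers to close the gap. Here's a toy sketch of that idea; the names and structure are hypothetical illustrations, not actual Borg or Kubernetes code:

```python
# Toy illustration of an orchestrator's reconciliation loop: compare
# desired state to actual state, then emit the start/stop actions
# needed to close the gap. All names here are hypothetical.

def reconcile(desired: dict, actual: dict) -> list:
    """Return the actions needed to move `actual` toward `desired`.

    Both arguments map an app name to a running-container count.
    """
    actions = []
    for app, want in desired.items():
        have = actual.get(app, 0)
        if want > have:
            actions.extend(("start", app) for _ in range(want - have))
        elif want < have:
            actions.extend(("stop", app) for _ in range(have - want))
    # Anything still running that is no longer desired gets shut down.
    for app, have in actual.items():
        if app not in desired:
            actions.extend(("stop", app) for _ in range(have))
    return actions

print(reconcile({"web": 3}, {"web": 1, "batch": 1}))
# → [('start', 'web'), ('start', 'web'), ('stop', 'batch')]
```

A real orchestrator runs this kind of comparison continuously, which is what lets it replace crashed containers and shut down ones whose jobs have completed.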
At the time, two Google engineers, Craig McLuckie and Joe Beda, who would later go on to start Heptio, were looking at developing an orchestration tool like Borg for companies that might not have the depth of engineering talent of Google to make it work. They wanted to spread this idea of how they develop distributed applications to other developers.
Before that first version hit the streets, what would become Kubernetes developed out of a need for an orchestration layer that Beda and McLuckie had been considering for a long time. They were both involved in bringing Google Compute Engine, Google’s Infrastructure as a Service offering, to market, but they felt like there was something missing in the tooling that would fill in the gaps between infrastructure and platform service offerings.
“We had long thought about trying to find a way to bring a sort of a more progressive orchestrated way of running applications in production. Just based on our own experiences with Google Compute Engine, we got to see firsthand some of the challenges that the enterprise faced in moving workloads to the cloud,” McLuckie explained.
He said that they also understood some of the limitations associated with virtual machine-based workloads and they were thinking about tooling to help with all of that. “And so we came up with the idea to start a new project, which ultimately became Kubernetes.”
Let’s open source it
When Google began developing Kubernetes in March 2014, it wanted nothing less than to bring container orchestration to the masses. It was a big goal and McLuckie, Beda and teammate Brendan Burns believed the only way to get there was to open source the technology and build a community around it. As it turns out, they were spot on with that assessment, but couldn’t have been 100 percent certain at the time. Nobody could have.
Photo: Cloud Native Computing Foundation
“If you look at the history, we made the decision to open source Kubernetes and make it a community-oriented project much sooner than conventional wisdom would dictate and focus on really building a community in an open and engaged fashion. And that really paid dividends as Kubernetes has accelerated and effectively become the standard for container orchestration,” McLuckie said.
The next thing they did was to create the Cloud Native Computing Foundation (CNCF) as an umbrella organization for the project. If you think about it, this project could have gone in several directions, as current CNCF director Dan Kohn described in a recent interview.
Going cloud native
Kohn said Kubernetes was unique in a couple of ways. First of all, it was based on existing technology developed over many years at Google. “Even though Kubernetes code was new, the concepts and engineering and know-how behind it was based on 15 years at Google building Borg (and a Borg replacement called Omega that failed),” Kohn said. The other thing was that Kubernetes was designed from the beginning to be open sourced.
Photo: Swapnil Bhartiya on Flickr. Used under CC by SA 2.0 license
He pointed out that Google could have gone in a few directions with Kubernetes. It could have created a commercial product and sold it through Google Cloud. It could have open sourced it but kept a strong central lead, as it did with Go. It could have gone to the Linux Foundation and said it wanted to create a stand-alone Kubernetes Foundation. But it did none of these things.
McLuckie says they decided to do something entirely different and place it under the auspices of the Linux Foundation, but not as a Kubernetes project. Instead, they wanted to create a new framework for cloud native computing itself, and the CNCF was born. “The CNCF is a really important staging ground, not just for Kubernetes, but for the technologies that needed to come together to really complete the narrative, to make Kubernetes a much more comprehensive framework,” McLuckie explained.
Getting everyone going in the same direction
Over the last few years, we have watched as Kubernetes has grown into a container orchestration standard. Last summer, in quick succession, a slew of major enterprise players joined the CNCF: AWS, Oracle, Microsoft, VMware and Pivotal all signed on, coming together with Red Hat, Intel, IBM, Cisco and others who were already members.
Cloud Native Computing Foundation Platinum members
Each of these players no doubt wanted to control the orchestration layer, but they saw Kubernetes gaining momentum so rapidly that they had little choice but to go along. Kohn jokes that having all these big-name players on board is like herding cats, but bringing them in has been the goal all along. He said it just happened much faster than he thought it would.
David Aronchick, who now runs the open source Kubeflow Kubernetes machine learning project at Google, was working on Kubernetes in the early days, and in a recent interview he told TechCrunch he is still shocked by how quickly it has grown. “I couldn’t have predicted it would be like this. I joined in January, 2015 and took on project management for Google Kubernetes. I was stunned at the pent up demand for this kind of thing.”
As it has grown, it has become readily apparent that McLuckie was right about building that cloud native framework instead of a stand-alone Kubernetes foundation. Today there are dozens of adjacent projects and the organization is thriving.
Nobody is more blown away by this than McLuckie himself who says seeing Kubernetes hit these various milestones since its initial release has been amazing for him and his team to watch. “It’s just been a series of these wonderful kind of moments as Kubernetes has gained a head of steam, and it’s been so much fun to see the community really rally around it.”
Didi Chuxing, China’s largest ride-hailing startup, which claims over 550 million registered users, is deepening its focus on electric vehicles after announcing a joint venture with BAIC, a state-owned automotive giant.
‘Jingju’ — as the venture is called — is a partnership between Didi and BAIC affiliate Beijing Electric Vehicle that will develop “next-generation connected-car systems” using fleet management, AI and other tech, according to an announcement made today.
The exact scope of Jingju is not clear from the details released, so we’ve asked Didi for more information. We’ll update this post as and when we get it.
Didi has long talked about plans to bring more environmentally friendly vehicles into its fleet, in line with efforts across China — Shenzhen, for example, has introduced electric taxis and buses. Back in late 2017, the company announced plans for its own EV charging network and, today, it claims to have nearly 400,000 “new energy” vehicles on its platform. Didi says it has clocked up 31 million registered drivers to date, so there’s obviously a lot of work to be done to raise the EV/hybrid representation.
But BAIC is an ideal partner to make that happen. Not only is it a key automaker in China but it has pledged to stop selling fuel-powered vehicles by 2025.
The joint venture is likely to tie into Didi’s existing driver services business, which helps drivers get access to services that include leasing and purchase financing, insurance, repairs, refueling, car-sharing and more. Essentially, with its huge army of drivers, Didi can get preferential rates from service providers, which means better deals for its drivers.
That, in turn, is helpful for recruiting new drivers and growing the business, which is under threat from new regulations that look set to limit the number of people who can drive for Didi.
The repeat grilling by the UK parliament’s DCMS committee today of Alexander Nix, the former CEO of the now-defunct Cambridge Analytica — aka the controversial political and commercial ad agency at the center of a Facebook data misuse scandal — was not able to shed much new light on what may or may not have been going on inside the company.
But one nugget of information Nix let slip was the names of specific data aggregators he said Cambridge Analytica had bought “consumer and lifestyle” information on US voters from, to link to voter registration data it also paid to acquire — apparently using that combined database to build models to target American voters in the 2016 presidential election, rather than using data improperly obtained from Facebook.
This is more information than Cambridge Analytica has thus far disclosed to one US voter, professor David Carroll, who in January last year lodged a subject access request with the UK-based company after learning it had processed his personal information — only to be fobbed off with a partial disclosure.
Carroll persisted, and made a complaint to the UK’s data protection watchdog, and last month the ICO ordered Cambridge Analytica to provide him with all the data it held on him. The deadline for that passed yesterday — with no response.
14/ Section 7 DPA required disclosure of Axciom, Experian as data sources. SCL Election failed to disclose it. More evidence that my Subject Access Request was not adequate. This is the basis of my ICO complaint and high court claim.
— David Carroll (@profcarroll) June 6, 2018
The committee questioned Nix closely over responses he had given it at his earlier appearance in February, when he denied that Cambridge Analytica used Facebook data as the foundational data-set for its political ad targeting business.
He had instead said that the work Dr Aleksandr Kogan did for the company was “fruitless” and thus that the Facebook data Kogan had harvested and supplied to it had not been used.
“It wasn’t the foundational data-set on which we built our company,” said Nix today. “Because we went out and we licensed millions of data points on American individuals from very large reputable data aggregators and data vendors such as Acxiom, Experian, Infogroup. That was the cornerstone of our database together with political data — voter file data, I beg your pardon — which again is commercially available in the United States. That was the cornerstone of our company and on which we continued to build the company after we realized that the GSR data was fruitless.”
“The data that Dr Kogan gave to us was modeled data and building a model on top of a model proved to be less statistically accurate… than actually just using Facebook’s own algorithms for placing advertising communications. And that was what we found out,” he added. “So I stand by that statement that I made to you before — and that was echoed and amplified in much more technical detail by Dr Kogan.”
And Kogan did indeed play down the utility of the work he did for Cambridge Analytica — claiming it was essentially useless when he appeared before the committee back in April.
Asked about the exact type of data Cambridge Analytica/SCL acquired and processed from data brokers, Nix told the committee: “This is largely — largely — consumer and lifestyle data. So this is data on, for instance, loyalty card data, transaction data, this is data that pertains to lifestyle choices, such as what car you drive or what magazines you read. It could be data on consumer habits. And together with some demographic and geographic data — and obviously the voter data, which is very important for US politics.”
We’ve asked the three data brokers named by Nix to confirm Cambridge Analytica was a client of theirs, and the types of data it licensed from them, and will update this report with any response.
Fake news committee told it’s been told fake news
What was most notable about this, Nix’s second appearance in front of the DCMS committee — which is investigating the role and impact of fake news/online disinformation on the political process — was his attempt to shift the spotlight via a string of defiant denials that there was much of a scandal to see here.
He followed a Trumpian strategy of trying to cast himself (and his former company) as victims — framing the story as a liberal media conspiracy and claiming no evidence of wrongdoing or unethical behavior had been produced.
Cambridge Analytica whistleblower Chris Wylie, who Nix had almost certainly caught sight of sitting in the public gallery, was described as a “bitter and jealous” individual who had acted out of resentment and spite on account of the company’s success.
Though the committee pushed back against that characterization, pointing out that Wylie has provided ample documents backing up his testimony, and that it has also taken evidence from multiple sources — not just from one former employee.
Nix did not dispute that the Facebook data-harvesting element of the scandal had been a “debacle”, as he put it.
Though he reiterated Cambridge Analytica’s previous denial that it was ever the recipient of the full data-set Kogan acquired from Facebook — which Facebook confirmed in April consisted of information on as many as 87M of its users — saying it “only received data on about 26M-27M individuals in the USA”.
He also admitted to personally being “foolish” in what he had been caught saying to an undercover Channel 4 reporter — when he had appeared to suggest Cambridge Analytica used tactics such as honeytraps and infiltration to gain leverage against clients’ political opponents (comments that got him suspended as CEO), saying he had only been talking in hypotheticals in his “overzealousness to secure a contract” — and once again painting himself as the victim of the “skillful manipulation of a journalist”.
He also claimed the broadcaster had taken his remarks out of context, claiming too that they had heavily edited the footage to make it look worse (a claim Channel 4 phoned in to the committee to “heavily” refute during the session).
But those few apologetic notes did little to soften the tone of profound indignation Nix struck throughout almost the entire session.
He came across as poised and well-versed in his channeled outrage. Though he has of course had plenty of time since his earlier appearance — when the story had not yet become a major scandal — to construct a version of events that could best serve to set the dial to maximum outrage.
Nix also shut down several lines of the committee’s questions, refusing to answer whether Cambridge Analytica/SCL had gone on to repeat the Facebook data-harvesting method at the heart of the scandal themselves, for example.
Nor would he disclose who the owners and shareholders of Cambridge Analytica and SCL Group are — claiming in both cases that ongoing investigations prevented him from doing so.
Though, in the case of the Information Commissioner’s Office’s ongoing investigation into social media analytics and political campaigning — which resulted in the watchdog raiding the offices of Cambridge Analytica in March — committee chair Damian Collins made a point of stating that the ICO had assured the committee it had no objection to Nix answering its questions.
Nonetheless Nix declined.
He also refused to comment on fresh allegations printed in the FT suggesting he had personally withdrawn $8M from Cambridge Analytica before the company collapsed into administration.
Some answers were forthcoming when the committee pressed him on whether Aggregate IQ, a Canadian data company that has been linked to Cambridge Analytica, and which Nix described today as a “subcontractor” for certain pieces of work, had ever had access to raw data or modeled data that Cambridge Analytica held.
The committee’s likely interest in pursuing that line of questioning was to try to determine whether AIQ could have gained access to the cache of Facebook user data that found its way (via Kogan) to Cambridge Analytica — and thus whether it could have used it for its own political ad targeting purposes.
AIQ received £3.5M from leave campaign groups in the run-up to the UK’s 2016 EU referendum, and has been described by leave campaigners as instrumental in securing their win, though exactly where it obtained data for targeting referendum ads has been a key question for the enquiry.
On this Nix said: “It wouldn’t be unusual for AIQ or Cambridge Analytica to work on a client’s data-sets… And to have access to the data whilst we were working on them. But that didn’t entitle us to have any privileges over that data or any wherewithal to make a copy or retain any of that data ourselves.
“The relationship with AIQ would not have been dissimilar to that — as a subcontractor who was brought in to assist us on projects, they would have had, possibly, access to some of the data… whether that was modeled data or otherwise. But again that would be covered by the contract relationship that we have with them.”
Though he also said he couldn’t give a concrete answer on whether or not AIQ had had access to any raw data, adding: “I did speak to my data team prior to this hearing and they assured me there was no raw data that went into the Rippon platform [voter engagement platform AIQ built for Cambridge Analytica]. I can only defer to their expertise.”
Alexander Nix just stated in live testimony that what I found was not raw voter data. Here's the truth: I purposely refrained from accessing the raw databases. I found the usernames, passwords, and network locations. All out in the open.
— Chris Vickery (@VickerySec) June 6, 2018
Also on this, in prior evidence to the committee Facebook said it did not believe AIQ had used the Facebook user data obtained via Kogan’s apps for targeting referendum ads because the company had used email address uploads to Facebook’s ad platform for targeting “many” of its ads during the referendum — and it said Kogan’s app had not gathered the email addresses of app installers or their friends.
(And in its evidence to the committee AIQ’s COO Jeff Silvester also claimed: “The only personal information we use in our work is that which is provided to us by our clients for specific purposes. In doing so, we believe we comply with all applicable privacy laws in each jurisdiction where we work.”)
Today Nix flatly denied that Cambridge Analytica had played any role in the UK’s referendum campaign, despite the fact the company was already known to have done some “scoping work” for UKIP — work it invoiced the party for (though it claims never to have been paid), and which Nix did not deny had taken place but sought to downplay.
“We undertook some scoping work to look at these data. Unfortunately, whilst this work was being undertaken, we did not agree on the terms of a contract, as a consequence the deliverables from this work were not handed over, and the invoice was not paid. And therefore the Electoral Commission was absolutely satisfied that we did not do any work for Leave.EU and that includes for UKIP,” he said.
“At times we undertake eight, nine, ten national elections a year somewhere around the world. We’ve never undertaken an election in the UK so I stand by my statement that the UK was not a target country of interest to us. Obviously the referendum was a unique moment in international campaigning and for that reason it was more significant than perhaps other opportunities to work on political campaigns might have been which was why we explored it. But we didn’t work on that campaign either.”
In a less comfortable moment for Nix, committee member Christian Matheson referred to a Cambridge Analytica document that the committee had obtained — described as a “digital overview” — and which listed “denial of service attacks” among the “digital interventions” apparently being offered by it as services.
Did you ever undertake any denial of service attacks? Nix was asked.
“So this was a company that we looked at forming, and we never formed. And that company never undertook any work whatsoever,” he responded. “In answer to your question, no we didn’t.”
Why did you consider it, wondered Matheson?
“Uh, at the time we were looking at, uh, different technologies, expanding into different technological areas and, uh, this seemed like, uh, an interesting, uh, uh, business, but we didn’t have the capability was probably the truth to be able to deliver meaningfully in this business,” said Nix. “So.”
Matheson: “Was it illegal at that time?”
Nix: “I really don’t know. I can’t speak to technology like that.”
Matheson: “Right. Because it’s illegal now.”
Nix: “Right. I don’t know. It’s not something that we ever built. It’s not something that we ever undertook. Uh, it’s a company that was never realized.”
Matheson: “The only reason I ask is because it would give me concern that you have the mens rea to undertake activities which are, perhaps, outside the law. But if you never went ahead and did it, fair enough.”
Another moment of discomfort for Nix was when the committee pressed him about money transfers between Cambridge Analytica/SCL’s various entities in the US and UK — pointing out that if funds were being shifted across the Atlantic for political work and not being declared that could be legally problematic.
Though he fended this off by declining to answer — again citing ongoing investigations.
He was also asked where the various people had been based when Cambridge Analytica had been doing work for US campaigns and processing US voters’ data — with Collins pointing out that if that had been taking place outside the US it could be illegal under US law. But again he declined to answer.
“I’d love to explain this to you. But this again touches on some of these investigations — I simply can’t do that,” he said.