Palantir buys Kimono Labs, cloud-hosted service to close

Kimono's cloud-hosted web-scraping service will shut down following its acquisition by Palantir.
Written by Steve Ranger, Global News Director

Data analytics company Palantir has acquired web-scraping startup Kimono Labs, which will now close down its publicly available cloud service.

Kimono is a browser-based tool that allows users to extract data from web pages without requiring extensive programming skills. It can be used to build apps and sites such as price-comparison services and sports-analytics tools.
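To illustrate the kind of work Kimono automated for non-programmers, here is a minimal sketch of page-data extraction using only Python's standard library. The HTML snippet, field names and CSS classes are hypothetical stand-ins for a product-listing page, not anything from Kimono's actual product:

```python
from html.parser import HTMLParser

# Hypothetical markup standing in for a product-listing page.
PAGE = """
<ul>
  <li class="product"><span class="name">Widget</span><span class="price">$9.99</span></li>
  <li class="product"><span class="name">Gadget</span><span class="price">$14.50</span></li>
</ul>
"""

class PriceScraper(HTMLParser):
    """Collects (name, price) pairs from spans tagged with the given CSS classes."""
    def __init__(self):
        super().__init__()
        self._field = None    # which field the next text node belongs to
        self._current = {}    # fields gathered for the row in progress
        self.rows = []        # completed (name, price) tuples

    def handle_starttag(self, tag, attrs):
        cls = dict(attrs).get("class")
        if tag == "span" and cls in ("name", "price"):
            self._field = cls

    def handle_data(self, data):
        if self._field:
            self._current[self._field] = data.strip()
            self._field = None
            if "name" in self._current and "price" in self._current:
                self.rows.append((self._current["name"], self._current["price"]))
                self._current = {}

scraper = PriceScraper()
scraper.feed(PAGE)
print(scraper.rows)  # [('Widget', '$9.99'), ('Gadget', '$14.50')]
```

Kimono's point-and-click interface effectively generated extraction rules like these for the user, then ran them in the cloud and exposed the results as an API.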

The company, which launched two years ago, said its product is used by over 125,000 developers, data scientists and businesses.

In a note on its website, Kimono said: "We've realized that continuing to work in isolation on a general data collection tool simply won't allow us to make the impact we want," adding that Palantir offers the support, resources and "the ability to work on things we could not tackle alone as a small startup".

The Kimono team said that "because of our new roles at Palantir" it will be shutting down its publicly available cloud-hosted Kimono product on 29 February.

"From that point forward users will no longer be able to log into kimonolabs.com services or access any data via the website or API endpoints," the company warned, although it is making a lightweight desktop version of Kimono available "to ease the transition".

Kimono for Desktop, for Mac OS X and Windows, will offer a similar experience to the web-based product at kimonolabs.com, but will not be cloud-hosted: the software will integrate with a new version of the Chrome extension and perform all data collection on the PC.

"The software download will be free of charge and will function entirely independently from the kimono platform -- and will be provided as is, without continued active development and without technical or customer support," Kimono said. APIs and data will not be accessible after the shutdown date, so there's a 30-day window -- until 31 March -- for customers to import their APIs into the desktop application to continue extracting data.

