Along with detailing how data is distributed across machines, the latest version of the MySQL Cluster database is billed as offering better performance and easier system upgrades.
Now that HTML5 has finally been standardized, Jeff Jaffe, the W3C's CEO, wants to get people talking about what to do next. His suggestion is to build an application foundation to underpin the Open Web Platform - an operating system for the web.
Organizations are increasingly using parallel processing technology to address their high-performance computing, technical computing, and big data analysis requirements. Adaptive Computing has long believed that its Moab HPC Suite should be the parallel processing monitor of choice.
In five years, analytics cluster framework Spark has moved from a research lab to the datacentre and production workloads. Databricks CEO Ion Stoica charts its rise.
Windows 10 will build in standards-based two-factor authentication to every device, effectively neutering most phishing attacks and password database breaches. The company also announced new features aimed at securing corporate machines from malware attacks and data leaks.
Lenovo and VMware plan to collaborate to develop software-defined data center solutions.
The National Computational Infrastructure has built an on-demand, high-performance cloud computing environment to process data-intensive workloads in fields such as climate change research, earth system science, and the life sciences.
Apple lays out the reasons why it could reject new apps that use its HealthKit and HomeKit frameworks.
No, developers will not be allowed to sell your health data for targeted advertising.
Having built IE11 to support web standards, Microsoft is now shimming it to handle mobile websites that ignore those standards in favour of WebKit quirks and non-standard features in Apple's Safari on iOS.
Connected Data announces a new SSD-based Drobo Mini that delivers what pro users have been clamoring for: performance. You can now edit video on a redundant array not much larger than a DVD drive.
Atlassian has unveiled its new enterprise offering, JIRA Data Center, which it says provides high availability and performance at scale for teams building mission-critical software, and is designed for use in customers' own data centres.
The vertically stacked data layers promise faster performance than solid-state drives using conventional NAND flash memory, but at a higher price.
If a mix of monitoring, scoring and presenting application performance data in a dashboard appears helpful and useful, AppNeta could be the right choice for you.
Continuuity is on a mission to make big data application development accessible, powerful, fast -- and enjoyable.
As virtualisation technology spreads through the datacentre, the race is on to develop ways of sharing out data to virtual servers and desktops in large numbers.
The idea that the internet generations don't care about privacy is a myth — so talk to them and find out the truth before collecting their personal data.
TransLattice addresses the need for always-on, high performance data management tools that place data close to where it is used or where regulations require.
The new facility will focus on projects around Internet of Things, Big Data and High Performance Computing
The ICT industry needs to develop new standards to strengthen infrastructure and improve collaboration between service and infrastructure providers if it wants to survive climate change, according to a new report.