The three new offerings build a sort of cloud-as-you-go strategy, allowing users to move applications and processes to the cloud whenever it makes the most sense.
Keeping applications reliable and available is a key challenge for most data centers. Veeam believes its Availability Suite V8 is the best answer, but that depends on whether you rely solely on x86-based solutions.
Deploying complex, high-performance applications both locally and in a cloud service provider's network can be very challenging. DivvyCloud believes it has the solution to this problem.
Australian emergency service agencies want access to GPS data on phones to locate callers in emergencies, but Telstra has said it is up to users to ensure that GPS applications are turned on.
The challenge developers face when building programmatic access to services and applications is that creating and documenting rational APIs is difficult. Apigee has packaged all the needed skills into a product called Apigee Edge.
New instances launched across seven regions deliver the highest level of processor performance on EC2, aimed at high-performance computing applications.
Redmond's latest salary survey shows employers paying 25 percent more and up for Azure and big data work.
Following up on Canada and London, IBM has tapped Germany to host a SoftLayer data center.
Many enterprises have begun deploying hybrid cloud models, gaining the dual benefits of offloading infrastructure concerns to the public cloud while safeguarding data in the private cloud.
The Australian Taxation Office is seeking applications for a new multi-use list to support the agency's foray into big data.
With great potential comes great responsibility: Here's how companies need to think about the Internet of Things to generate the greatest benefits.
This gorgeous atlas of a modern city does what most infographics only aspire to: it takes a vast amount of information and makes it clear and understandable.
VoltDB's co-founder and chief strategy officer discusses the growing requirement for in-memory databases to address the demand to gather, analyze and make sense of operational, machine, social media and other types of data. Do you agree?
A whole-of-government multi-use list for vendors of data analytics tools and services, opening for application requests in early January, will play a central role in the ATO's data and analytics program, set to kick off next year.
Hitachi Data Systems has predicted that in 2015, the market will see enterprises deploy their critical legacy applications into a hybrid cloud infrastructure.
HP says the iteration allows users to tap into key components of the HP Haven Enterprise platform within minutes.
Organizations are increasingly using parallel processing technology to address their high-performance computing, technical computing, or big data analysis requirements. Adaptive Computing has long believed that its Moab HPC Suite should be the parallel processing monitor of choice.
A land and expand strategy is leading to strong growth for Splunk, best known for its machine data monitoring software.
The internet of things could be very helpful to organizations, or it could bury them under the weight of operational data from a fleet of devices, their software, and their applications. Glassbeam believes its SCALAR is the tool that will make sense of it all.
CA Technologies advises that to deal with the move to the application economy, enterprises need to prepare for the unwired enterprise, ambient data, and API-assembled applications.