Organizations are increasingly using parallel processing technology to address their high performance, technical computing or big data analysis requirements. Adaptive Computing has long believed that its Moab HPC Suite should be the parallel processing monitor of choice.
Windows 10 will build in standards-based two-factor authentication to every device, effectively neutering most phishing attacks and password database breaches. The company also announced new features aimed at securing corporate machines from malware attacks and data leaks.
Lenovo and VMware plan to collaborate to develop software-defined data center solutions.
The National Computational Infrastructure has built an on-demand, high-performance cloud computing environment to process data-intensive computations in fields such as climate change, earth system science, and life sciences research.
Connected Data announces a new SSD-based Drobo Mini that gives pro users what they have been clamoring for: performance. You can now edit video with a redundant array not much larger than a DVD drive.
Atlassian has unveiled its new enterprise offering, JIRA Data Center, which it says provides high availability and performance at scale for teams building mission-critical software, and is designed for use in customers' own data centres.
The vertically stacked data layers promise faster performance than solid-state drives using conventional NAND flash memory, but at a higher price.
If a mix of monitoring, scoring and presenting application performance data in a dashboard appears helpful and useful, AppNeta could be the right choice for you.
As virtualisation technology spreads through the datacentre, the race is on to develop ways of sharing out data to virtual servers and desktops in large numbers.
TransLattice addresses the need for always-on, high performance data management tools that place data close to where it is used or where regulations require.
The new facility will focus on projects around the Internet of Things, Big Data and High Performance Computing.
The ICT industry needs to develop new standards to strengthen infrastructure and improve collaboration between service and infrastructure providers if it wants to survive climate change, according to a new report.
With two high-performance memory modules onboard, this implementation of LSI's popular MegaRAID HBA uses flash to cache frequently accessed data, boosting storage performance without the upheaval or cost of switching from magnetic disk to SSD.
New Relic, a big data and application performance monitoring company, is branching out to analytics in a move that will make its service more consumable.
Spotfire Mobile Metrics can collate and analyse your BI data and deliver the key metrics needed to monitor business performance direct to a notebook, smartphone or tablet.
Appurify Mobile Platform includes tools for addressing performance issues and flawed design early in the mobile development process, and offers data to address them.
Concurrent, already known for its Hadoop development platform, Cascading, has introduced a management tool to help developers find the source of performance problems.
Singapore ICT regulator reiterates its pledge to develop local capabilities in data analytics and announces the appointment of the country's first chief data scientist.
Big Blue's so-called X6 Architecture is designed to boost the performance of x86 servers so they can better handle workloads such as big data analytics, virtualization and enterprise resource planning.
As data usage continues to grow exponentially, IT managers will need to orchestrate multiple kinds of storage — including flash, hard disk and tape — in a way that optimises capacity, performance, cost and power consumption.