SQL Server 2012

Summary: SQL Server 2012 supports everything from departmental servers to large-scale enterprise clouds. With Hadoop integration imminent there's much to be said for upgrading — if only to prepare your datacentre for big-data private clouds. Just keep an eye out for the effect of per-core licensing on your costs.

  • Editors' rating:
    8.9

Pros

  • Scales from department to cloud
  • xVelocity technologies deliver significant performance boost
  • New Business Intelligence edition adds analytic tools
  • Enhanced development tools and improved Visual Studio integration

Cons

  • Per-core licensing can mean higher costs for high-end deployments
  • Hadoop integration not here yet

Not many databases scale from in-app to cloud, but that's what Microsoft has done with SQL Server. Developers can build it into their apps, or use SQL Azure to deliver cloud-scale data. Now it's time for the rest of the SQL Server family to take advantage of the lessons Microsoft learned from building the cloud-hosted, scalable SQL Azure — and to get your data ready for the Windows Server 8 private cloud.

Microsoft has reduced the number of SQL Server editions in the 2012 release. The low-end Workgroup and Small Business Editions have been retired, along with the high-end Datacenter Edition. That leaves the Enterprise and Standard Editions, which are joined by a new member of the SQL Server family, the Business Intelligence Edition. Standard is designed for departmental databases, while Enterprise is for large-scale applications and data warehouses; the new BI edition adds a wide range of analytic tools — including the new Power View self-service analysis and reporting tools — to the functionality of the Standard edition.

Installation is quick and easy, with a set of wizards to help ensure that the appropriate prerequisites are installed, and that your system is tuned for running databases

Performance and the private cloud
Performance is important for any modern database. You'll need to deliver responses to queries faster than ever, and to many different applications across the organisation. SQL Server 2012 adds new index types to help speed up operations, especially with large databases. One approach is aimed at read-only workloads, like data warehouses. ColumnStore indexes add a new view on top of existing tables, allowing queries to fetch only the columns they need rather than entire rows. It's an approach that can deliver up to 100x speed increases — something that's increasingly important where data warehouses can have billions of rows and thousands of columns. ColumnStore indexes aren't held entirely in memory but are fetched as required, improving the performance of the in-memory operations needed for most large business intelligence systems. ColumnStore indexes are just part of a selection of technologies that Microsoft is branding xVelocity, focused around support for in-memory databases.
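Building one is a single DDL statement. As a sketch — the table and column names below are invented for illustration — a columnstore index on a warehouse fact table might look like this:

```sql
-- Hypothetical fact table; all names are illustrative.
-- In SQL Server 2012 a nonclustered columnstore index is read-only:
-- the table can't be updated while the index exists.
CREATE NONCLUSTERED COLUMNSTORE INDEX ix_FactSales_cs
ON dbo.FactSales (OrderDateKey, ProductKey, Quantity, SalesAmount);

-- Queries touching only indexed columns can now scan compressed
-- column segments instead of reading whole rows.
SELECT ProductKey, SUM(SalesAmount) AS TotalSales
FROM dbo.FactSales
GROUP BY ProductKey;
```

To load new data you drop or disable the index, insert, and rebuild — which is why Microsoft pitches the 2012 columnstore at read-mostly data-warehouse workloads rather than transactional databases.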

SQL Server 2012 also adds improved data encryption features, with end-to-end data encryption that works locally and in SQL Azure. Access control is tied to Active Directory, and can be used in conjunction with SharePoint to secure analytic models and data. AES256 is used for keys, with SHA512 for password hashes. Integrated audit tools mean it's also easier to ensure that you're complying with all the relevant regulations for your data.
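Transparent Data Encryption is configured in T-SQL, following the same certificate-based pattern introduced in SQL Server 2008. A minimal sketch — the database, certificate name and password here are all invented for illustration:

```sql
-- Run in master: create a master key and a certificate to protect
-- the database encryption key (names and password are illustrative).
USE master;
CREATE MASTER KEY ENCRYPTION BY PASSWORD = 'Str0ng!Passphrase1';
CREATE CERTIFICATE TDECert WITH SUBJECT = 'TDE certificate';
GO

-- Switch to the (hypothetical) user database and turn encryption on.
USE SalesDB;
CREATE DATABASE ENCRYPTION KEY
    WITH ALGORITHM = AES_256
    ENCRYPTION BY SERVER CERTIFICATE TDECert;
ALTER DATABASE SalesDB SET ENCRYPTION ON;
```

Back up the certificate and its private key immediately — without them an encrypted database can't be restored on another server.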

Microsoft is talking a lot about private cloud these days — witness its positioning of System Center 2012 and Windows Server 8. So it's really no surprise to find the company talking about SQL Server 2012 as another part of its cloud strategy. There's certainly a lot in the new release that builds on the cloud-hosted SQL Azure. For example, there's tooling for resource pooling, for scale (both up and out), for self-service management and for utility-like operations — focusing on support for SQL Server in virtualised environments.

Microsoft gives you multiple ways of managing SQL 2012, including PowerShell; using the PowerShell SQL Server cmdlets, scripts can be authored on your desktop and then run remotely across several servers

It's certainly an approach that many organisations will find attractive. As SQL Server has become easier to use, there's been an explosion of departmental databases built using it rather than Access. A version of SQL Server that allows consolidation and control, while still supporting self-service operation, will give both IT departments and business users what they want. To support this, Microsoft will be releasing a set of System Center Management Packs for SQL Server 2012 (as well as tools for building SQL Server application templates for use with System Center 2012's self-service tools). Other features that help to build a scalable SQL Server environment include the new AlwaysOn features, with support for Availability Groups and massive clusters that work across multiple datacentres. There's also increased support for Microsoft's Hyper-V virtualisation platform, with support for Live Migration and server replication. You can also scale dynamically, with the enhanced Resource Governor controlling the memory and CPU used by the SQL Server database engine.
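Resource Governor limits are set in T-SQL with a pool, a workload group and a classifier function. A hedged sketch, assuming a hypothetical reporting login that should be capped (all names are made up):

```sql
-- Run in master. Cap a hypothetical reporting workload at
-- 30% CPU and 25% memory (Enterprise edition feature).
USE master;
GO
CREATE RESOURCE POOL ReportingPool
    WITH (MAX_CPU_PERCENT = 30, MAX_MEMORY_PERCENT = 25);
CREATE WORKLOAD GROUP ReportingGroup USING ReportingPool;
GO

-- Classifier function: routes sessions from an illustrative
-- login into the capped group; everything else stays in default.
CREATE FUNCTION dbo.rg_classifier() RETURNS sysname
WITH SCHEMABINDING
AS
BEGIN
    DECLARE @grp sysname = N'default';
    IF SUSER_SNAME() = N'reporting_user'
        SET @grp = N'ReportingGroup';
    RETURN @grp;
END;
GO

ALTER RESOURCE GOVERNOR WITH (CLASSIFIER_FUNCTION = dbo.rg_classifier);
ALTER RESOURCE GOVERNOR RECONFIGURE;
```

The classifier runs at login time, so a slow function will delay every new connection — keep it to simple checks like the one above.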

Microsoft will be providing documentation to help organisations deploy their own cloud platforms, as well as working with HP to deliver a reference architecture for SQL Server 2012 private clouds. There's also the option of using preconfigured HP Database Consolidation Appliances to quickly roll out cloud-scale databases in your own datacentre. A half-rack Consolidation Appliance can host around 200 databases, with the system scaling to 10 full racks. Each rack has 192 cores and 58TB of storage.

High availability and big data
Any SQL Server 2012 cloud will need to take advantage of its new high-availability features. Availability Groups are easy to put together, and can be used to protect key databases. It's also easier to build clustered databases, with support for up to four secondaries for multi-database failover. There's also enhanced disaster recovery support, with tools for handling multi-site failover (letting you host secondaries off-site using shared storage). These features aren't just for cloud deployments, though: AlwaysOn databases are a key component of critical business processes, as well as a tool for supporting web services and sites. Server Core support should also increase uptime, reducing the need for downtime due to patching and giving servers a smaller attack surface.
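Creating an Availability Group is itself a single statement on the primary, once a Windows failover cluster and database mirroring endpoints are in place. A sketch only — server, database and endpoint names below are all hypothetical:

```sql
-- Assumes a WSFC cluster and mirroring endpoints already exist.
-- All names and URLs are illustrative; run on the primary replica.
CREATE AVAILABILITY GROUP SalesAG
    FOR DATABASE SalesDB, OrdersDB
    REPLICA ON
        N'SQLNODE1' WITH (
            ENDPOINT_URL      = N'TCP://sqlnode1.example.com:5022',
            AVAILABILITY_MODE = SYNCHRONOUS_COMMIT,
            FAILOVER_MODE     = AUTOMATIC),
        N'SQLNODE2' WITH (
            ENDPOINT_URL      = N'TCP://sqlnode2.example.com:5022',
            AVAILABILITY_MODE = SYNCHRONOUS_COMMIT,
            FAILOVER_MODE     = AUTOMATIC);
GO
-- Then, on each secondary:
-- ALTER AVAILABILITY GROUP SalesAG JOIN;
```

Synchronous commit with automatic failover suits replicas in the same datacentre; for the multi-site scenarios described above, asynchronous commit with manual failover is the usual choice.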

Big databases mean big data, and Microsoft has used its work on SQL Azure to improve how SQL Server handles large-scale databases. Instead of being thrown away, data is stored for possible future investigation using analytic tools. Part of the new approach is integration with tools like Hadoop, using a connector to deliver data from SQL Server and receive processed results for analysis.

SQL Server 2012's Data Quality Services tools help ensure your data is clean, ready for use in business intelligence applications

Business intelligence and development
We're storing more and more data, and SQL Server 2012 is ready to handle that. But we also need to get insight from that data, to help businesses make the right decisions. That's one of the reasons why Microsoft has bolstered the analytic tools built into its database, with a whole new Business Intelligence Edition that can work with both structured and unstructured data. 

In the past, much of Microsoft's analytics approach has been centred on Excel and its PowerPivot modelling tools. That changes with SQL Server 2012, which introduces a more database-centric approach to BI. Part of this is a new semantic model for BI data, which allows you to have one model that can be used (or created) by individuals, teams or IT. SQL Server 2012's BI semantic model handles business logic, storage and use of SQL Server's VertiPaq in-memory engine. There's also improved support for the now-venerable SQL Server Reporting Services, with the new Power View visual report design tools and integration with SharePoint, as well as the ability to build alerts into BI reports and models.

Part of the SQL Server 2012 BI tooling, PowerPivot lets you build BI reports in Excel, working with massive data sets in real time

You can use SQL Server 2012's Business Intelligence Development Studio to create and manage semantic models for all your business intelligence applications and reports

Data warehouses using SQL Server 2012 get a performance boost from the new column indexes, but that's only part of its BI story. PowerPivot gets an upgrade, with new tools for more complex analytical models — including support for hierarchies and multiple relationships, making it easier to understand how data is linked. The resulting models can now be converted into applications and embedded in dashboards using SharePoint, so insights from one part of a business can become tools that help an entire organisation. IT departments also have greater control over how PowerPivot is used, while end users still design and build their own analytic models. More complex models can be built using the BI Development Studio, which integrates with Visual Studio and lets you take a project-based approach to building analytic models.

One of SQL Server 2012's new features, Power View, allows you to build interactive data visualisations in your web browser, and then share them with the rest of a team or the whole organisation

Power View is Microsoft's response to tools like Tableau, focusing on data visualisation and running in a web browser rather than as a separate application. You start by using PowerPivot to construct a data mashup from various services, and then use Power View to create interactive visualisations based on that data. There are plenty of viewing options in Power View, but it's missing some key visualisations, including mapping. Once you've created a visualisation, you can use a SharePoint library to share your Power Views.

SQL Server's Management Studio can be used inside and outside Visual Studio, to maintain, design and configure databases

Like earlier versions, SQL Server 2012 is being released with a suite of development tools — something that's becoming increasingly important as developers need to scale databases from mobile and embedded applications, to departmental, enterprise and on to the cloud. SQL Server 2012 adds new design-time tools with validation and verification, and tools for developing locally before publishing to live databases; application and data development are linked into one environment, using Visual Studio as a common IDE. The full range of SQL Server Development Tools cover everything from table design to code analysis and SQL debugging — and, of course, to database publishing. The same tools target on-premises SQL Server instances and cloud-based SQL Azure, and a local runtime means you can use familiar Visual Studio debugging features with both queries and stored procedures.

If you're using Visual Studio 2010, SQL Server 2012 will add new project options — giving you one place to build database applications, from data model to code

Pricing
One of the biggest changes to SQL Server 2012 isn't to any of its underlying technologies — it's to how Microsoft will be licensing it. These changes are partly a response to the way databases are being deployed — especially with regard to virtualisation — and to how they're being used, with more focus on business intelligence and less on pure data storage. They're also a response to the way silicon architectures have changed, and to the increasing number of cores and threads in a modern server processor. Microsoft describes it as paying for the power you need, or for the users you need: Enterprise Edition is limited to core-based licensing; Business Intelligence Edition has server plus client access licensing; and Standard Edition has the option of either the per-core or server+CAL method.

Licences are based on a minimum of four cores per processor, with a single core licence priced at the equivalent of a quarter of the cost of the old per-processor licence. The Standard edition is limited to 16 cores, and the same limit applies if you're using the BI edition as a database. However, if you're using the Enterprise edition, or the BI edition as an analysis tool, you get access to all the cores the OS supports. Confusingly, if you're using Enterprise under the old server+CAL model (for organisations with a current SA agreement that are upgrading to SQL Server 2012), you're limited to 20 cores per server.

A core can be either physical or virtual, and if you're virtualising a core-licensed SQL Server 2012 you'll need a minimum of four core licences per VM. One advantage of the new licensing model is portability: if you're licensing SQL Server 2012 through Software Assurance you'll be able to move licences from your own servers to hosted private clouds, or to SQL Azure instances in the public cloud. That means you can associate SQL Server licences with your data, ensuring you've got the appropriate licences in place wherever your data resides.

Conclusion
There's a lot of change happening in the datacentre, and it's no surprise to see Microsoft responding to those changes in the latest release of SQL Server. With Windows Server 8 and System Center 2012 both focused on delivering private clouds, SQL Server had to follow. The result is a database that supports both cloud scenarios and improves reliability for more traditional implementations. The same changes, with support for improved in-memory operations, also form the foundation of its enhanced business intelligence features. Although Power View looks like Microsoft playing catch-up with Tableau, the underlying BI architecture in SQL Server 2012 simplifies building and managing analytic services — something that both Microsoft and third parties can build on.

The result is a well-rounded database, with support for everything from departmental servers to large-scale enterprise clouds. With Hadoop integration just around the corner there's a lot to be said for upgrading to SQL Server 2012 — if only to prepare your datacentre for the big-data private clouds that are on their way. Just keep an eye out for the effect of per-core licensing on your costs.


About

Born on the Channel Island of Jersey, Simon moved to the UK to attend the University of Bath where he studied electrical and electronic engineering. Since then a varied career has included being part of the team building the world's first solid state 30KW HF radio transmitter, writing electromagnetic modelling software for railguns, and t... Full Bio
