
Microsoft details plans for Visual Studio and .NET

Visual Studio chiefs Jason Zander and Matt Carter set out the company's manifesto for data democracy
Written by Adrian Bridgwater, Contributor

In the wake of the recent PDC and TechEd developer events, Microsoft has decided to put some of its key executives out on the road to explain the innovations that Visual Studio 2010 and .NET 4.0 have in store.

Microsoft is promoting the next version of its Visual Studio toolset, code-named Rosario, as offering new levels of analysis of the application development process.

Having made a well-rehearsed pledge to democratise application lifecycle management, the company is backing it with a set of product enhancements it says will meet the software development needs arising from trends such as virtualisation, cloud computing and parallelism.

Attempting to shed light on the forthcoming tools with a visit to the UK were Redmond-based Jason Zander [pictured, right], general manager for Visual Studio, and Matt Carter [left], group product manager in the same division. ZDNet UK caught up with them both at Microsoft's London headquarters in Victoria.

Q: What's the core technology proposition for the new tools and how will the new releases simplify everyday development tasks?
Carter: With VS2010 [Visual Studio 2010] there will be a strong focus on providing insight into the development process in terms of the structure and function of the code. We're also concerned with making it easier to build web applications. We want to encourage the development of departmental business applications that utilise the Office UI, and we want to make SharePoint development feel like Visual Studio development so that usability is improved.

We also want to reach out to C++ developers if they have a large investment in terms of lines of C++ code, so they can now carry those forward into a Visual Studio environment. There will also be evidence of our investments in Visual C++ to simplify development of native Windows 7-based applications, and this will mean support for innovations compliant with Windows 7, such as multitouch user interfaces.

Specifically, how will developers be able to work competently with increasingly complex applications if they adopt the forthcoming tools within the .NET 4.0 framework?
Zander: If you learn a language like C# or Visual Basic and you learn a framework like .NET and how to program against it and combine that with Visual Studio, then those three things together provide a very consistent environment for working towards numerous platforms that you may want to target — complex or otherwise.

This is already the case now, but it will be more so when .NET 4.0 arrives.

If the integration elements of Rosario are needed so urgently, why have the so-called development silos you often talk about become so entrenched? Surely this segmentation has arisen through the use of much of your existing technology.
Zander: In a big enterprise there will always be multiple tiers of development with externally facing elements sitting alongside internal business management needs, so silos will always exist to some degree. What we need to look at now is a situation where, let's say, a procurement department needs to build in a new external web service as well as form tighter links to the rest of the business. What we're trying to do with our tools is make sure the programming for those different segments — and, crucially, being able to stitch them together — becomes a simple task.

Your marketing people are fond of saying VS2010 will "democratise application lifecycle management from architects to developers, to project managers to testers". Where's the substance for that kind of statement?
Carter: The substance, for us, comes from the information-sharing and insight improvements we've made. We've looked very hard at the problem of non-reproducible bugs — when a tester tries unsuccessfully to replicate reported defects. We have a new test tool that allows a developer to view a screen-captured video of the defect as recorded by the tester. At the same time, the developer can also view the historical debugging information and machine state at the time of the problem. By lowering barriers and making sure everyone works from the same repository of information, you get a far greater sense of a team and that, for us, represents democratisation.

Visual Studio Team System [VSTS] 2010 architecture is claimed to bring non-technical users into the modelling process to define business and system functionality. How do we keep business managers reined in so their requirement specifications stay under control?
Carter: It's all about transparency. Through VSTS we aim to make available all the reporting and business intelligence necessary for business users to be able to view the status of a project. So if that reporting exists and is delivered to business users via tools they're used to, such as Excel and Outlook, it must represent a positive addition to the project at hand.

Your next Windows Azure tools are aligned to development for the cloud. How will they look and feel in practice?
Zander: We want to make it possible for developers to use all their .NET programming skills for the cloud. There will be a sandbox security model similar to that which we have provided with the ASP.NET web application framework. The best practices you can find with that technology will also extend to Azure on the cloud.

Carter: With Azure, the key thing is everything will look very familiar to you as a Visual Studio developer, because the programming model is the same. With the same components at hand, we hope developers will see a movement to the cloud as a natural and evolutionary extension.

What tools do you have to help developers with the techniques chip developers say are necessary for multicore?
Zander: In terms of VS2010 and parallel computing, there is a new set of libraries specifically built to enable developers to write parallel code. At the base level we have a new runtime called the concurrency runtime, which allows me as a developer to take advantage of all the cores present on the machine. Secondly, the tooling inside VS2010 will be enhanced so both the debugger and the profiler are able to track all the extra work you're scheduling for the machine and see how well it is executing.
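The concurrency runtime and parallel libraries Zander describes are Microsoft-specific technologies not detailed in this interview. As an illustrative sketch of the kind of task-based parallelism such libraries expose — splitting work into tasks that the runtime can schedule across all available cores — here is a generic C++ version using the standard `std::async` facility rather than Microsoft's concurrency runtime itself (the function name `parallel_sum` and the chunking scheme are illustrative assumptions):

```cpp
#include <future>
#include <numeric>
#include <vector>
#include <cassert>

// Sum a vector by splitting the work into independent tasks.
// std::launch::async asks the runtime to run each task on its
// own thread, so the chunks can execute on separate cores.
long long parallel_sum(const std::vector<int>& data, unsigned tasks) {
    std::vector<std::future<long long>> futures;
    const std::size_t chunk = data.size() / tasks;
    for (unsigned i = 0; i < tasks; ++i) {
        const std::size_t begin = i * chunk;
        // The last task picks up any remainder.
        const std::size_t end = (i + 1 == tasks) ? data.size()
                                                 : begin + chunk;
        futures.push_back(std::async(std::launch::async,
            [&data, begin, end] {
                return std::accumulate(data.begin() + begin,
                                       data.begin() + end, 0LL);
            }));
    }
    long long total = 0;
    for (auto& f : futures) total += f.get();  // join and combine
    return total;
}
```

The pattern — fork independent chunks, then join and combine their results — is the same shape that task-parallel libraries automate, with the added benefit that their schedulers balance tasks across cores dynamically.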

You're making a big play for web developers with the new products. Other than full support for Silverlight, which we would have expected, what else is new?
Zander: Of course, it's more than just Silverlight support, though as we head towards Silverlight 3.0 that will be important. With the new products we have incorporated new model-view-controller (MVC) patterns, and we're also shipping the jQuery JavaScript library with VS2010, including full IntelliSense auto-completion support.

VSTS 2010's testing and debugging features have been described as a black-box recorder to help eliminate non-reproducible bugs. Do you think you'll 'eat your own dog food' and improve your own beta releases with this technology?
Zander: Absolutely. One of the sessions people will have seen at PDC and TechEd, delivered by Stephanie Saad, was designed specifically so she could document all the instances where Microsoft is 'dog-fooding' on the development of VS2010 and VSTS. It is used internally right across teams like the Microsoft Office division, where thousands of developers will be contributing code at any one time. In fact, using 'dog-fooding' as a verb in this way has been the norm at Microsoft for some time now. We're pretty comfortable with it.
