WWDC 2013 to announce iOS 7, OS X 10.9 previews: What else?

Summary: UPDATED (June 3, 2013): Apple's developer conference is expected to bring a raft of new product announcements, with a general focus on software. What should we expect to see this month when Apple's Tim Cook and others take to the stage in California? Here's what we think.

  • New Mac Pro due soon; perhaps a WWDC debut?

    Apple chief executive Tim Cook told one Mac Pro user that the company still considers its tower-buying customers "really important" to Apple.

    The email, verified by Macworld, was sent in June 2012, around the time of last year's WWDC. Cook added that the customer should not worry because "we're working on something really great for later next year."

    Sister site CNET reported that the new Mac Pro might be ready by April, but no such announcement came. Perhaps Apple was waiting for Intel's new "Haswell" chips to be finalized? If so, a new Mac Pro tower could be on the cards for WWDC 2013.

    Unfortunately, European customers will not benefit: Apple opted to stop selling the tower in the 27-member bloc in March after the European Union adopted stringent new electrical safety regulations.

  • Apple TV updates in preparation for an ecosystem push?

    Again, it's very unlikely that Apple will announce a television at WWDC. 

    The company could, of course, release a brand-new Apple TV device to complement an opening-up of the set-top box's development platform. Opening the platform would give developers deeper hooks into the device; once a healthy ecosystem of apps and services develops, Apple could then begin to cater to that audience.

    The Wall Street Journal also suggested that an updated version of Apple TV could be on the cards to erase the distinction between live and on-demand television content.

  • iLife, iWork updates expected

    It's very possible that Apple's productivity and creative suites will also be updated in time for a WWDC 2013 release. In its note to the press, Apple specifically noted "iLife, iWork and professional software" designed for the Mac. Because WWDC is typically a software-oriented event, it's entirely plausible that the two flagship Mac-based suites will also see a major refresh.

    iWork and iLife are available on both Macs and iOS-based devices, again bridging the divide between mobile and the desktop thanks to their built-in iCloud support.


Talkback

  • Hopefully...

    They won't pull a "modern UI" on OS X. I can't really see the use of touch on a desktop (even more so considering my iMac is beyond arm's length! :))
    MichelR666
    • I agree.

      I don't want touch screens on any of my desktops. If they're going to change anything, I'd like to see application menus attached to their windows, so the interface is more multi-monitor friendly.

      I also don't understand why they need to make the apps look less like real objects. I think the fact that the built-in applications looked like real-world objects is part of what made iOS approachable and familiar for the masses. It was a key to the crazy adoption rate.

      I'd be more excited about Siri and Maps if they'd just fix them. I've used Maps, in particular, a total of three times and two of those times it sent me to a completely wrong location which was pretty far away from where I needed to be. Needless to say, I haven't used it since. And Siri is just a mess of "I don't understand the question." Plus, the room has to be dead silent for it to work. I don't think I'm ever in a room which is dead silent.

      Then there's iCloud. I remember the original introduction showing a photo taken on your iPhone automatically appearing on your iPad 60 seconds later. What a crock. I've taken photos on my iPhone that didn't appear on the iPad until a day or two later. A few never appeared. It's buggy, undependable, and broken. The control over notifications on shared calendars is non-existent. Everyone gets a notification or nobody gets one. Even music I've purchased doesn't always appear on all of my devices. If you can't depend on it working as advertised, it's pointless to have it.

      Lastly, iMessage is the biggest joke in the world. It tries to send everything as an iMessage first. Most of the time, it introduces a big delay which doesn't happen when you send as a text message. Half the time, it won't send because you have WiFi turned on but the signal it detects is too weak, and yet it doesn't automatically fall back to try to send it as a text message when it fails. If you turn iMessage off so that everything is sent as a text, you can't receive iMessages from others who still have it turned on. There is no indication that you didn't receive the message, either. They need to set it up so that you can select how you want to send and how you want to receive separately. It's as annoying as standing in a swarm of mosquitoes.
      BillDem
      • NO!!!!!!!

        Do not EVER attach menus to applications. What a horrible idea. Instead, spread the global menu across all your monitors. Attaching menus to applications is a HORRIBLE interface convention and is one of the reasons Windows users work the inefficient way they do (running every app full screen).

        In fact, I think it has been a stupid move for Apple and other software vendors to lock toolbars into document and application windows. If you want awesome multi-monitor support, go back to the days when all your controls were in their own floating windows. Power users with multiple monitors would put their document on one monitor and all their controls, in their various palettes, on a second monitor. It was an absolutely incredible way to work.
        baggins_z
        • YES!!!!!!

          One of the very few places where the Windows experience is better is the fact that application menus are attached to the application windows. I can keep my focus on the work and don't have to go digging around on my other monitor to make a tweak to the application. I don't have any problem with tear-off menus that I can put anywhere, but I'd also like the default to be with the application. It makes ergonomic sense. The current Apple approach doesn't.
          MC_z
          • No!

            Ergonomic sense? Um, no. You can't just say things and magically make them true, and using multi-syllable buzzwords like "ergonomic" doesn't increase their veracity.
            There is a little thing called Fitts's Law that is at the heart of all UI science. Fitts's Law is quite clear in regard to application menus attached to windows: it is a stupid idea that drastically decreases efficiency by slowing down the time it takes to select menu items. This is not a debatable point. It has been studied over and over again, with actual data points based on actual users.
            This leaves unmentioned the numerous other issues, such as the wasted screen real estate used to replicate the menus on every window of an application. If your only complaint is not having a menu bar on your current monitor in a multi-monitor setup, countless utilities exist to put the menu bar on every monitor. There is no reason to take a step backward in user efficiency by regressing to the Windows route.
            .DeusExMachina.
          • Baloney

            Any time you have to rotate your head from one large monitor to another to remain in the same task is bad design. It's like having the steering wheel on one side of your car and the brakes on the other. Don't know who Fitts is and don't care.
            MC_z
          • If you don't know Fitts's Law, you aren't competent to comment on UI issues

            And no one said anything about having to switch monitors. Nor is there any truth to the idea that having to "rotate" your head automatically means bad design. Such things are measurable, and I guarantee you that I can hit a menu item three monitors distant faster than you can hit a menu target on a focused window in the current one.
            .DeusExMachina.
          • Do you even listen to yourself?

            You "guarantee" you can turn your head and hit a menu several screens away faster than we can hit one that's virtually under our cursor and currently visible? Apparently YOU don't' know Fitt's Law.

            Since you and Bagginz are so ignorant, let me quote Fitts's Law as stated in the opening paragraph on Wikipedia:
            "[Fitts's Law] ...predicts that the time required to rapidly move to a target area is a function of the distance to the target and the size of the target."

            So your statement, based on the Fitts's Law you admire so much, is impossible.

            Fitts's Law clearly shows that it is OS X that is poorly designed when used across multiple displays. Windows 7 adheres to Fitts's Law; OS X does not.
            BillDem
          • And… thanks for demonstrating that you know crap all about UI design

            Fitts's Law clearly shows nothing of the kind, which you'd know if you actually understood it rather than cutting and pasting definitions you found online. For instance, one of the key corollaries to Fitts's Law is that the effective size of an item on a screen edge is essentially infinite in the dimension of that edge. An item affixed to the top edge of a screen, say a menu, has a vertical extent that is effectively infinite, because the user cannot overshoot the screen edge: the cursor simply stops there. That removes the need for the motion damping a user normally applies to avoid overshoot, the overshoot that happens anyway when they don't slow down enough, and the corrective backtracking on missed items. The upshot is that a person can simply let their cursor fly at top speed toward the top of the screen and hit the menu with high precision and very high accuracy. Contrast this with window-centric menus, where the user has to first find the menu (whose horizontal position is not fixed to screen boundaries but varies with window position), move the cursor toward that target, slow down as they approach, backtrack to correct overshoot, often more than once, and finally click to activate.
            Empirical data has shown TIME AND TIME AGAIN that the fixed menu bar is faster and a more efficient interface (a worked example of the formula appears at the end of this thread).
            So yes, I can guarantee that I can hit a menu three monitors away faster than you can hit a window-based menu on the same monitor, almost 100% of the time.

            You need to check your own ignorance before you make a fool of yourself calling others out in print.
            Seriously, you don't have even a superficial understanding of Fitts's Law, let alone an appreciation for its subtleties and nuances and their ramifications for UI design.
            .DeusExMachina.
          • Only a 6% market share OS would try that smokescreen...

            Apple must obviously do things better because everyone prefers OS X on their desktops, right? Oh wait, they don't. Only a tiny minority use it.

            I'm sorry, but there is nothing logical about having your application menus on a different screen than your application. Even when you force OS X to run the top menu across all three monitors, your application menus show up at the FAR LEFT of the first screen while your windows might be on screen 2 or screen 3. It is also illogical to have an application menu on every window of an application, as described in the research you cited. Fortunately, Windows 7 does neither of those things, which is why 90% of people use it versus the 6-7% using OS X. Windows puts the menu on the MAIN application window; toolbox windows are still arrangeable across several monitors, and each one does not carry its own menus. The research you cited compared Apples to rotten Apples, not to Windows. Menus on every application window would definitely be absurd; fortunately, neither Windows nor OS X does that.

            Owning and using both OSes on multiple-monitor setups, I can say that I absolutely prefer the Windows 7 UI over OS X, even more so now that OS X wants to be iOS. Using many apps at once on multiple monitors is simply tedious on OS X when compared to Windows 7. Then again, I've never run Windows applications full screen, either.
            BillDem
          • And apparently you don't know crap about Windows, either

            First, let's address your OS X ignorance. No, the utilities that extend the menu across all monitors do NOT have the menus show up only on the far left. What point would that serve?!? Do you even bother to think before you post?!? The utilities in question either reproduce the menu bar on all monitors or place the menu on the monitor of the currently focused window.
            Second, yes, it is illogical to have the menu reproduced on every window, but that does NOT mean that many, if not most, apps don't do EXACTLY that. Your nonsense about MAIN application windows is just that, nonsense. For instance, I have five documents open in Word. Which, pray tell, is the "main" application window? The scenario you describe, with one document window and satellite palette windows, is hardly the norm. So no, that is NOT "how Windows 7 does it". And the ribbon interface makes this even worse. Sure, they try to get around it with a tabbed interface instead of windows, but as soon as you spawn a new window, which is needed many times, such as for side-by-side editing of documents, you get all the menus and ribbons replicated.

            As for your nonsense about this being why people prefer Windows, you saying so is hardly proof of people's motivations and preferences. People use what they know; what they know is dictated by what they are exposed to; and what they were exposed to, especially in the early days of computing, before most people had their own personal machines, was what they had access to at work. That was Windows, via IBM, and that started a ball rolling that has not stopped. Windows market share has NOTHING to do with people evaluating the UI issues and making an informed choice. Most people have never even seen OS X (or Ubuntu, or any other OS for that matter), so they cannot make such a comparison at all, let alone an informed one.
            .DeusExMachina.
    • OS X has already taken a "light merger" approach

      Apple has brought in Launchpad, which looks like iOS, but it is pretty unobtrusive and can be left on the dusty skeuomorphic bookshelf.

      Despite the complaints from Snow Leopard holdouts, I find Apple has been very conservative about change - nothing's been forced on us that we can't undo in settings (like the trackpad settings).
      Mac_PC_FenceSitter
      • Not quite true

        A large number of changes from SL cannot be changed back in settings, and they detract from the user experience: the black-and-white icons in the sidebar, for instance, or the behavior of Quick Look relative to other windows.
        And nothing about Launchpad is skeuomorphic.
        .DeusExMachina.
      • Wrong.

        I hate to correct you, Mac_PC_FenceSitter, but there are a lot of things you could do in Snow Leopard that were completely removed in Lion/Mt. Lion. Take "Spaces", for instance. I have a 2010 iMac and I still have Snow Leopard on it because I use the Spaces feature a lot. As Apple used to say, it "just works".
        nevertell
    • Jonathan Ive

      Gets design, so I think he probably understands that WIMP and Touch are fundamentally different ways to interact with a computer. And, if not, all they have to do is look at the visceral dislike Windows 8 is receiving in the market right now.
      baggins_z
  • Urgh

    Here we go again. Apple announces their Developer Conference and everyone and their mother starts to speculate. As if Apple would announce a Watch and a Radio or a TV at a developers conference.

    Microsoft will show their new Surface and a Surface Phone and new Watch at their Build conference. Just because I want to speculate too. And Google will launch a new Nexus phone at IO.
    Dreyer Smit
    • Developers first, then users

      Of course they would announce new hardware at WWDC--especially if it's a new platform (TV, watch, etc.). The success or failure of a hardware platform rests on a robust universe of apps available to do cool things. The developers need to "buy in" to the hardware, then build products for it. If I am not mistaken, it is not the first time they have teased new hardware at WWDC. It may not have all the fanfare of a consumer launch, but it lets the cat out of the bag: "this is coming, here it is, and here is the SDK for it."

      That said, I think a watch is a terrible idea, and will be a major flop.
      IllinoisMike
  • Will it matter?

    Tim Cook is no Steve Jobs when it comes to the dog and pony show the Apple fans fell in love with.
    NoAxToGrind
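
Postscript: the formula argued over above, Fitts's Law in its Shannon formulation, predicts the time to acquire a target as MT = a + b * log2(D/W + 1), where D is the distance to the target and W is its size along the axis of motion. Below is a minimal Python sketch of that prediction; the constants and pixel figures are assumptions chosen for illustration, not measurements from any study cited in the thread.

    import math

    # Fitts's Law (Shannon formulation): MT = a + b * log2(D / W + 1)
    # D = distance to the target, W = target size along the axis of motion.
    # The constants a and b depend on the user and pointing device; the
    # values below are assumed, roughly typical figures for a mouse.
    A = 0.1  # intercept, in seconds (assumed)
    B = 0.1  # slope, in seconds per bit (assumed)

    def movement_time(distance_px, width_px):
        """Predicted pointing time, in seconds, for a target of size
        width_px (along the direction of travel) at distance distance_px."""
        return A + B * math.log2(distance_px / width_px + 1)

    # A menu bar pinned to the screen's top edge cannot be overshot, so its
    # effective size along the direction of travel is practically unbounded;
    # model that with a very large width_px.
    edge_menu = movement_time(2000, 10_000)  # far away, near-infinite target
    window_menu = movement_time(400, 20)     # close, but a small 20 px target

    print(f"edge-anchored menu, 2000 px away: {edge_menu:.2f} s")     # ~0.13 s
    print(f"window-attached menu, 400 px away: {window_menu:.2f} s")  # ~0.54 s

With these assumed numbers, the distant edge-anchored menu still wins, because the screen edge makes the effective target size enormous. Real constants vary by user and device, so the figures are illustrative only.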