More Intel chips coming to ZTE phones

Summary: ZTE signed a strategic cooperation agreement with Intel last week to equip its next-gen Android smartphones with the latest Atom processors.


Chinese network-equipment and cellphone maker ZTE will equip its upcoming smartphones with Intel's newest processor, the Atom Z2580, according to a report on Chinese tech site QQ Tech.

Compared with its predecessor, the Z2460, Intel's new hyper-threaded, dual-core processor is said to have doubled computing performance and tripled graphics performance, while still maintaining competitive battery life.

The deepening cooperation between ZTE and Intel owes much to ZTE's steady expansion in the European market. The Grand X IN, ZTE's first phone with an Intel chip, was the best-selling Android phone in Austria. The model is also available in other European countries, including Germany, Poland, Hungary, Romania, Greece, Sweden, and Norway.

"Grand X IN was the top-end product of ZTE in Europe, [and] we successfully launched the model thanks to our close cooperation with Intel," said Ao Wen, general manager of ZTE's Europe mobile unit. "We look forward to future cooperation with Intel and its support for ZTE's development in the high-end smartphone market."

"The collaboration with ZTE made a positive impact on Intel's newly designed smartphone processors," said Hendrik Unkel, Intel's marketing and business development director. "We believe that high performance, competitive battery life, and their combination with the whole brand value will be recognized by the customers."

Topics: Processors, Intel, Smartphones, China

  • Wonder Why They Don't Do Windows Phone?

    Oh, wait --Windows Phone doesn't run on Intel chips...
    • It could do.

      They designed it to be easily ported, and neither MS nor Intel have ruled it out. Intel have even declared themselves 'open' to it. And MS have said it's possible.

      For now it seems their mobile chips are probably better suited to tablets, but there will always be a niche for all types of phone. Haswell should be interesting for Intel in the pro tablet area if they hit their claims of the Haswell architecture running under 8 W.
      • Re:

        "They designed it to be easily ported" back in 1993, and look at how well that turned out: every single non-x86 port is now defunct.

        No, you're never going to see a mobile Microsoft OS on an Intel chip. Proprietary OSes are just too expensive to port.
        • Apple'd disagree

          Microsoft may as well... RT.

          Actually, historically, open source has had more issues porting; trust me, I'm a Linux guy who owned a G4!

          Proprietary software is easy; you just pay someone to do it. Just because there is always an open source project to get x running on y does not mean it is easy or successful.

          However, Windows Phone (unlike Windows Mobile) is a port of the NT kernel TO ARM. It is most definitely possible to 'port' back to x86.

          In the past, porting an OS was a pain because all your binaries would stop working; Apple went Intel... no new software was able to run on my G4. Boo hoo.

          However, that was then and this is now. Just as Windows 8 apps run on both Pro and RT, but Windows desktop binaries from previous versions only run on Pro... well, Windows Phone 8 apps can run on a ported OS without all needing to be recompiled by their developers... just like the Intel Android devices, without Java.
          • Re: Actually historically open source has had more issues porting

            That's why Linux can run on 2 dozen major processor architectures, including your PowerPC G4. That's more than any other OS in history.

            That's why Android already runs on 3 different architectures.
          • Utter lack of knowledge...

            Distros supporting packages for two dozen architectures, please?

            Even PPC has been dropped like a stone in recent years, due to the effort required to maintain it. Even Canonical booted PPC out to a community effort pretty soon after Apple's switch.

            The king of porting is NetBSD, not GNU/Linux. You can port ANY OS to any platform. The required steps have nothing to do with open or closed source. However, you argued that proprietary OSes don't get ported... well, I gave you examples that went flawlessly: Win8 to ARM, Mac OS X to x86, plus the Windows NT kernel to ARM for phones.

            I'm an open source guy through and through. But I happen to believe exposing your weaknesses forces you to address them; papering over the cracks weakens the wall. Anything can be ported, but maintaining those ports has crippled many a distro's development; it is far more 'expensive' for a distro, as you divide your resources, than for a proprietary OS that has a central strategy and costs it up prior to the port.

            Please reassure me that your case below is not an argument for desktop Android?
          • Re: Distros supporting packages for two dozen architectures, please?

            Debian does about a dozen. Including ARM and PowerPC. Even MIPS and SPARC. Oh, and IBM S/390 mainframes.

          • 2 dozen.

            Yes, I've been using Debian since potato.

            My point was that there are no distros supporting two dozen architectures, for exactly the reason I listed above. Most porting in Linux occurs due to the need to run on servers and mainframes. This is easier, as you don't need to port anywhere near as many metapackages or GUI packages.

            If you look into the repositories for those, you will find diminished package counts. Porting all of Debian's 30-odd thousand packages is where the extreme expense to their resources comes in.

            Put it this way: Debian puts in all this work developing desktops for x86, PPC, and ARM (you'll be hard pressed to find many SPARC home/office machines still able to run desktops... maybe an old SPARC server on eBay). It is at the top of the stream. Tens, or maybe hundreds now, of distros are downstream, either directly or via Ubuntu. Why do so few support PPC? Most of the legwork has been done. It's because of resources. How many projects are able to concurrently release both their primary and alternate desktops, let alone architectures?

            Your point was that it is easier and cheaper for open source to port an OS. It isn't. As I said, it is actually harder, as YOU have to port all the supporting packages, not their developers, and it is not cheaper, because it costs a lot more in resources. It is far more likely to happen, because open source allows the users of the software to dictate its use, but this is hard earned. Freedom comes at a price; there's no capital city or admiral of the fleet. No Jobs to say 'we're going Intel, people,' or Ballmer to say 'our apps have to work on Intel and ARM chips... make it so, number one.'

            To be honest, I don't get the argument here. I personally owned three PPC machines between 2001 and 2010. I'm well aware of the campaign and struggle to get distros and flavours running on that hardware, of how quickly support then ebbed away, and of the reliability of updates and which packages would be available. It was great that these old machines didn't hit the tip when their developers dropped them, and all the range of things you could do with them. But it came through hard work, and many a time compiling from source.

            Open source software kept those machines running, made them customisable, gave choice, and unlocked their power. But it wasn't clean or pretty at times. I am incredibly grateful to all who contributed, but let's not make light of their achievement by saying it was easy.

            With regard to ARM, yes, they are all working on porting to ARM. However, what is needed is, again, an ability to run packages independently of hardware through a compatibility layer/runtime environment/virtualisation... whatever is most efficient. If (when) ARM comes to the desktop, having to again port whole OSes and repositories will severely slow down development by dividing resources.
          • Re: it is actually harder as YOU have to port all the supporting packages,

            No, that's so much easier. For example, look at Raspbian: the Raspberry Pi runs an older ARM architecture, but it does have hardware floating-point, and the standard Debian ARM port doesn't support that unusual combination. So a couple of volunteers took it upon themselves to recompile all 15,000 standard Debian packages for the Pi. It took them about 6 weeks. And now that's become the recommended distro for the Pi.
          • Sorry I should have been clearer I meant third party software.

            As in once you have ported your system, you have to port all the software your operating system will run.

            What you've given me is an entire project about getting Debian armhf (already ported) running on the Raspberry Pi. Be honest: this 'couple of people' weren't getting 15,000 x86 binaries running on ARM in 6 weeks, were they?

            It is harder to port all the third-party software yourself. The project is also not finished. Neither is the ARM port of Debian. Windows is. Android is. iOS is.

            They ported the 15,000 base packages. That's nothing to do with open source. I'll even stretch to propose that a dedicated team would have finished faster than the volunteers.

            You have yet to provide any example of porting being easier due to open source licensing, or that Windows Phone couldn't run on Intel chips, or of a smoother port than OS X to x86 or Windows 8 to RT.

            Again. You get more choice in open source, but there's always a compromise.
          • Re: Sorry I should have been clearer I meant third party software.

            It's usually quite easy to build third-party software, even if your distro provides no package for it. The usual commands (once you have downloaded the source) are

            configure && make && sudo make install

            (Or if not, there should be an INSTALL file explaining what to do.) I've done this for lots of third-party packages. Also handy for ones included with the distro, if for some reason the distro package is not quite the latest version, say.
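            The flow above can be sketched end-to-end. This is only an illustration, not any particular project's build: the Makefile, the `hello` program, and the staging directory are all invented for the example, and a DESTDIR staged install stands in for `sudo make install` so nothing touches the live system (which also makes cleanup trivial: delete the staging tree).

```shell
# Sketch of the build-from-source flow, using a throwaway project so it is
# runnable anywhere with GNU make >= 3.82 (for .RECIPEPREFIX, which lets us
# avoid literal-tab recipe lines). A real tarball would ship its own
# configure script and Makefile; everything here is invented for the demo.
set -e
workdir=$(mktemp -d)
cd "$workdir"

# Stand-in for the project's Makefile.
cat > Makefile <<'EOF'
.RECIPEPREFIX = >
PREFIX ?= /usr/local

hello:
> printf '#!/bin/sh\necho hello\n' > hello
> chmod +x hello

install: hello
> mkdir -p $(DESTDIR)$(PREFIX)/bin
> cp hello $(DESTDIR)$(PREFIX)/bin/hello
EOF

make                                    # build step ("configure" omitted here)
make install DESTDIR="$workdir/stage"   # staged instead of "sudo make install"

"$workdir/stage/usr/local/bin/hello"    # prints: hello
```

Running the installed file from the staging tree confirms the install target worked; a real system install would drop DESTDIR and use sudo.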
          • So open source is easier to port because

            You get all your users to compile all your binaries for you?

            Okay, okay, I think we've pretty clearly covered all this now. Installing from source is a ridiculous solution. That's going back to the very earliest days of Linux. Trust me, as I said two posts ago, I've been all through that, so no, Linux isn't easier to port, because you have to either port all your repos or distribute source code.

            Anyway, that aside, I would advise against installing lots of software from source. For one, if the developer doesn't include uninstall scripts (or includes buggy ones), getting rid of it is a total pain.

            Secondly, when you get to metapackages, chasing tens of dependencies several levels back will drive you mad.

            I'd advise looking up binary packages wherever possible to hand over to your package manager, or using checkinstall to make a binary for you. As a general rule, it's better to use the tested binaries unless you plan to edit the makefile or other config files. I'd also advise against chaining your commands with && so you can see the output more easily.

            I assume you are using Ubuntu, given the sudo command, so I'd definitely advise using checkinstall, as dpkg can then be used to remove the resulting .deb when needed.
          • Re: You get all your users to compile all your binaries for you?

            I'm not sure what you're trying to say here. You were trying to claim that portability was difficult, pointing to the example of Microsoft Windows where apps are only distributed in binary form compiled for one specific architecture--OF COURSE those binaries are going to be hard to port to another architecture! Source code makes that so much easier. That's why Open Source is inherently more portable than proprietary software. That applies to Open Source OSes like those built on Linux, as well as to the Open Source apps that run on those OSes.

            Proprietary platforms will simply never be able to keep up.
          • This has really looped...


            Source code is a pain and twenty years old. It is not how you port a system; if you're building from source, it's not ported in a usable way. So shall we stick with where I started:

            "Of course those binaries are going to be hard to port"

            I've seen nothing here that says it's easier, just more likely, which is exactly where we started. It isn't any easier; in fact it's harder and takes longer, and proprietary OSes do get ported.

            Your new argument seems to be that open source is more likely to get ported. As this was my second point, shall we just call it a day? Of course open source is more likely to get ported. For the third(?) time: open source gives you the freedom of choice, but it's not a clean, well-oiled machine.

        • It turned out very well. It's why they were easily able to port it to ARM

          after porting it to Itanium, MIPS, Alpha, and other CPU architectures. There's a very thin HAL. And oh yeah, it already runs on x86, so no porting required for this. Yes, their "mobile" OS is Windows.
          Johnny Vegas
          • Re: It's why they were easily able to port it to ARM

            Yeah, it only took them about 3 years. That's why Windows Phone 7 had to use the Windows CE kernel instead. That's why WP8 represented a complete reboot--no carryover of dev tools or APIs, developers have to start all over again.

            And it's why it's so easy to port ARM Windows to any ARM device--oh, wait, you can't; only Linux can do that.

            And it's why it's so easy to develop one code base to deploy across Windows Phone, Windows RT and desktop Windows--oh, wait, you can't.