Ironically enough, it was the mobile devices that drove me to it. This is going to take a bit of an explanation, however, so go grab a fresh cup of coffee and a snack, and make yourself comfortable.
I've always been a big fan of personal computers, ever since I was first introduced to the Apple II and Commodore 64. I never owned either of those systems, and was a bit envious of family and friends who did, but somehow I instinctively knew how to use them effectively, even as a child. The first personal computer purchased by my father for our household was an Apple Macintosh LC (low-cost color), which served us well for typing papers and for entertainment in the form of video games such as SimCity 2000. It also made it a relatively simple task for me to adapt to the Macintosh computers that were the most prevalent systems in the computer labs at Syracuse University, albeit in a Unix-networked environment.
However, the first personal computer that belonged solely to me was a Windows 95 custom desktop offered for the low price of $1,200 at Staples, and I rather liked it. The computer had an Intel Pentium 100 processor, 16 megabytes of memory, a 680 megabyte hard drive, a 2x CD-ROM drive, a 14,400 bps dial-up modem and an Avance Logic SVGA graphics card with 128 KB of on-board memory (expandable to 256 KB). It was not particularly fast, nor was it particularly slow. It did everything I wanted a desktop computer to do at that time - it permitted me to play Star Wars: X-Wing and Star Wars: Dark Forces, it gave me access to the internet, and I even partitioned the hard drive so I could install Slackware Linux, my very first introduction to the Linux world.
Once I entered the workforce, this computer was upgraded with a better processor and motherboard, more RAM, and a better graphics card so it could support newer games like Jedi Knight (Dark Forces II) and modern computing in general. I added a network card so that I could take advantage of the DSL internet service obtained for my shared living space. I eventually built a new computer with even faster components and upgraded to Windows 98. Still later I upgraded those components again, moved to Windows Not for Me (aka Windows Millennium Edition), and finally settled on Windows XP before purchasing my first laptop computer. On this laptop (a Medion Akoya LS) I installed Ubuntu Linux for the first time and became an ardent fan of the distro. Very quickly it became my operating system of choice for almost all aspects of computing.
By this time I had been in the field of computer repair for several years and had grown weary of all the security pitfalls inherent in Windows operating systems, particularly in Windows XP. Even though I would continue to sell and recommend Windows systems to my customers, I had reached a point where I personally chose to use Linux as my primary desktop environment. After my children managed to hose our family computer with a particularly virulent Trojan infection, I banned Windows from our home. All computer systems in our household from that point forward (until recently) ran either Mac OS X, iOS or some distribution of Linux, typically Mint or another Debian derivative. I had already given up on Ubuntu due to the company's change in direction from developing an easy-to-use desktop operating system to a bizarre, unintuitive mobile-device/desktop hybrid. I had also picked up a second-hand MacBook Pro to use as my personal computer, and was now using OS X as my operating system.
So with that summary out of the way, here is where I currently stand:
A desktop operating system is intended to be a platform that makes it easier for the user to take advantage of what a computer can do, without needing to be a programmer or computer engineer. Every time I ran into circumstances where an operating system failed to do this, or instead imposed additional burdens just to keep itself usable, I abandoned that platform.
In regard to Windows XP, I abandoned the operating system (and subsequently Windows Vista, 7 and 8) because I perceived those operating systems to be severely lacking in security. Even though Microsoft began to invest in securing its product with the release of Service Pack 1 for Windows XP, and each release since then has been a big step toward securing its operating systems, they were still regularly targeted by malware developers who would either compromise the entire system, making it unusable, or compromise the security of sensitive activities, such as online banking. I abandoned Windows in favor of Ubuntu Linux, which I perceived to be a superior desktop operating system in terms of security, even though it was inferior in ease of use.
In regard to Ubuntu Linux, I abandoned the operating system as soon as Canonical switched its focus from ease of use to chasing the next "new thing" that would somehow make the company profitable. As far as I know, Canonical to this day is still not making a profit, and it has succeeded in alienating a significant portion of the very user base that might eventually have helped it become profitable. Why Ubuntu ignored Red Hat's business model is beyond me. I switched to Debian for a while, which was less easy to use than Ubuntu, then tried CrunchBang (#!), Mint and Arch, and finally reached my limit with Arch.
In regard to any distribution of Linux as a desktop environment: when I found myself spending 10-30 minutes a day fixing something that broke after installing security updates, I rapidly began to feel that the operating system was no longer useful, especially when the recommended fix in more than one scenario was to "reinstall the operating system" (something that should be a last resort, not a recommendation). The specific issue that broke the metaphorical camel's back was that I was using an AMD APU and R7 graphics accelerator in Dual Graphics (CrossFire) mode, which is currently unsupported by the open source AMD driver and results in the computer hard-locking, requiring a manual restart. The proposed fix was to install the proprietary drivers from AMD, which resulted in my system booting to a black screen no matter what configuration changes I made. That sort of crap just gets in the way of getting something done. I don't have time for that anymore, and I only foresee myself using Linux as a server operating system in the future (a role for which it is best suited).
In regard to OS X as a desktop environment, the issue is twofold. One, the MacBook Pro I purchased second hand is almost ten years old. The fact that it still runs well is a testament to the engineering that went into this particular laptop; however, it also means the newest version of OS X that supports this laptop is Lion, which is rapidly approaching its end of life in terms of security updates. Two, I kept finding myself banging my head against what I thought were simple issues to resolve. For example, I wanted to install the Atom text editor as a development tool, and even though the source code was available, I couldn't: there was no prebuilt version for Lion, and I couldn't compile one myself because the dependencies couldn't be met. I couldn't upgrade the operating system and I couldn't install a (relatively) simple application, simply because of the limitations imposed by the operating system and Apple's refusal to support a computer whose graphics card lacks OpenGL ES 2.0 shader support - essentially a 3D rendering limitation. I'll be impressed if anyone ever successfully explains to me why an operating system needs 3D rendering support. The solution of forking over money for a newer model is not satisfactory in this scenario - aside from the lack of shader support, this computer is perfectly usable, and it is currently running Windows 7 in Boot Camp without any issues whatsoever. This is another classic example of an operating system getting in the way of the user.
In regard to the iOS operating system, which is not a desktop but is certainly one of the most popular operating systems in use today, desktops included - Apple has created an annoying walled garden that can only be breached by overwriting the firmware of the iPad or iPhone with a hacked version of the operating system (i.e., jailbreaking it). In other words, you cannot install software that Apple does not let you install unless you compromise the device's security. Worse, you cannot access data on the device easily, and certainly not without purchasing apps that make such access possible. The day I found out that I could not plug an iPad into a MacBook and have direct, immediate access to the photos stored on the device was the day I swore I would never purchase one for myself. Unfortunately, because we have several in use in our household, I have the misfortune of needing to provide support for these shiny turds on a frequent basis, and the frequency with which the digitizers on these devices shatter is simply awe-inspiring.
In regard to the Android operating system, I have yet to experience an Android device that does not slowly degrade over time, eventually requiring a factory reset of the firmware. The performance degrades as well, meaning the longer you have an Android device the slower it gets (gee, that sounds familiar; I seem to remember Windows users complaining about this). The thing is, Android is based on Linux, and if it used a typical Linux file system this would not be an issue. Unfortunately, cell phone companies are not okay with making all aspects of a cell phone freely available to its owner, resulting in the crap we have today. When I was ready to buy a new smart phone last year, I came to the realization that the reason iOS is so popular is that Android is cheap garbage, and given the huge, gaping holes recently found in Android security, I do not foresee myself purchasing an Android device anytime in the near future.
So what type of smart phone did I buy? A Lumia: a smart phone manufactured by Nokia (with a certain reputation for ruggedness) running Windows Phone 8.1, with future support for Windows 10 Mobile (which I am currently running even though it has not yet been officially released). The biggest limitation of this device is the lack of apps when compared to an iOS or Android device, but as a mobile device with a reliable operating system that does everything I need a smart phone to do, it passes with flying colors. Not only that, but it was affordable, and I am paying a very reasonable monthly fee for service without a contract through Cricket.
Even though I had established a prior bad relationship with Microsoft based on its security troubles, there were other factors I had to consider. I was no longer a repair technician but a developer, and after extensive use of Visual Studio in the course of my work I can honestly say it is one of the best pieces of software I have ever worked with, hands down. Every version released is an improvement on the one before. Visual Studio requires Windows, and as a direct result Windows has become my primary operating system on a daily basis. What I have found is that, unlike OS X or Linux, Windows typically does not get in my way; in general, it does a very good job of getting out of it. Windows 8 is an obvious exception, but 8.1 has resolved most of the annoyances, and for the most part I don't have an issue with it.
In addition, with Windows 10, Microsoft realized for the first time that its operating system is not its primary source of revenue, but rather a gateway to better sources. To me, this is an indication that Microsoft has started to wake up, realize it needs to think and act differently to remain relevant, and take steps to do so. Windows 10 is a pleasure to use because it gets out of my way and lets me work. It is intuitive, and it has not (as yet) caused me any issues that I could not quickly and easily resolve.
While there is still some concern regarding user privacy and the data collected by Windows 10, I think that overall it's a step in the right direction, and I look forward to seeing what else comes down the pike. Without being dismissive of privacy, I'm also a bit jaded: I don't think there is a computing device that connects to the internet that cannot be compromised, whether by hackers, government-funded organizations or something even more nefarious. Consequently, privacy is not as much of an issue for me as it might once have been. I've come to accept that in the world of the Internet of Things, privacy has been compromised for a long time, and that understanding this will lead to smarter practices by the people who use these devices.