
Wednesday, December 26, 2012

Linux and the future of the desktop

These days I have been reading articles and opinions on the net about the new Windows 8. Why? Because Windows 8 is the beginning of the end of the desktop as we know it. Windows 8 is the first step in Microsoft's roadmap to unify all its current platforms (home computers, tablets and phones) under a single interface.

But using the same interface on such different devices means you can only target the lowest common denominator of all of them, in this case the phone. And that also means you must use your desktop computer the same way you use your phone.
The new interface is called Metro. From my point of view it is a huge step backwards in computer usability, because all Metro applications run full screen: no more “multitasking”. You can't have two or three (or four or twenty) programs running simultaneously on your screen: you can have only one at a time, just like in the Sinclair Spectrum era.

Oh yes, I can hear you yelling at me: “No, you are wrong. The desktop is still there and you can run all your applications on it...”. Yes, the desktop is still there, but for how long? Windows 8 has a Metro Desktop application to run legacy programs, but it is clear that at some point, maybe with Windows 9 or Windows 10, the desktop will be completely removed from Windows, and Metro applications will be the only ones the operating system runs natively. Another point is that Metro applications can't be downloaded from web pages or installed from removable devices: the only way to install Metro applications is through Microsoft's marketplace.

Does this sound familiar to you? It sounds very familiar to me: Google Market (now Google Play) or Apple's App Store. Even “classic” Windows applications are called “legacy applications” in the Windows 8 world, and we all know what a legacy application is: http://en.wikipedia.org/wiki/Legacy_software

As a ham radio operator I can't figure out how to do all my tasks in a single-window system: currently I can decode PSK31 signals while watching YouTube and talking with my friends on Gtalk, while the computer transcodes a movie from my camera with ffmpeg. As an amateur programmer, I don't like having to go through a software store, with no alternative, to distribute my programs. None of these things can be done with Metro.

So, what is the future for people who like and want to use the traditional desktop? Someone said Linux.

Linux... the truth is I have been using Linux (Debian stable) for almost a decade on servers, but I haven't used Linux regularly on a desktop computer since my university days, when fvwm was the standard window manager. I really like the Unix world and I enjoy the power of its terminals (my Windows computer has almost all the GNU utilities: bash, grep, awk, sed, wc, head, tail, strings, less...), so I tried a modern Linux both on my laptop and on my desktop computer. For familiarity I used Debian Testing and also Linux Mint.

After almost two months of deep testing on the laptop and on the desktop, I can say that Linux, in the year 2012, is still not ready for the desktop, and probably it never will be. Sad but true. This is not a Linux vs Windows war, nor a fanboy speech. If you feel it is, please stop reading and go elsewhere on the net.

Today both Linux and Windows are rock solid operating systems. If you get unexpected blue screens of death (BSOD) or kernel panics, take a screwdriver and check for faulty electrolytic capacitors on your motherboard and in your power supply. In my experience, 95% of the problems in modern computers are caused by defective electrolytic capacitors, and the remaining 5% by overheating.

So, if Linux is as rock solid as Windows, has better security, and is free (in both senses), why doesn't Linux have a significant presence on the desktop? The reasons would fill an entire book, but these are some of the problems I found that keep Linux from reaching the masses, or even from getting onto my desktop computer as the primary operating system:


Too many distributions

Yes, there are too many distributions. Currently there are hundreds of Linux distributions. Some people say this is good, but it is really bad: very few users, extremely fragmented. Also, most distributions are incompatible with each other, so there is not a single Linux: there are hundreds of Linuxes out there.


Hardware detection and autoconfiguration

When I played with Debian Testing on my laptop, I found that my touchpad neither selected things when tapping nor scrolled. After searching the Internet I found how to solve the problem: http://wiki.debian.org/SynapticsTouchpad
This is the kind of solution I would expect in the 90s, but not from a modern OS in the year 2012. If you have the “bad luck” of owning an nVidia graphics card, the solution is:
http://wiki.debian.org/NvidiaGraphicsDrivers
Again, a set of solutions that is not acceptable in the year 2012.
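
To give an idea of what “solving” the touchpad means, the fix boils down to writing an X configuration snippet by hand, something along these lines dropped into /etc/X11/xorg.conf.d/ (a sketch only, assuming the synaptics driver; the exact options depend on your hardware and X server version):

Section "InputClass"
    Identifier "my touchpad"
    Driver "synaptics"
    MatchIsTouchpad "on"
    Option "TapButton1" "1"            # tap to click
    Option "VertEdgeScroll" "on"       # scroll on the right edge
    Option "VertTwoFingerScroll" "on"  # two-finger scrolling
EndSection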

I also had problems with the Fn keys: screen brightness worked nicely, but audio volume control did not. After spending an entire evening searching the Internet, I gave up. On the other hand, I must say that Linux Mint 12, 13 and 14 detected and configured all these things properly, right out of the box.


Google is your friend, but be careful...

When you have a problem with Linux (and you surely will), you'll end up looking for a solution on Google. You must be very careful with what you find: if you apply a solution meant for another distribution / kernel version / X version / libc version / whatever, you can end up with a severely screwed system. Outdated solutions can also harm your installation.

Always check the date of the proposed solution or, even better, use your search engine's advanced search and limit the results to the last month or maybe the last year, but no more! This will save you time.


Distributions are closed ecosystems

The whole idea of a Linux distribution is fine for a beginner or for someone without great expectations. For the average or advanced user it is like a jail, a closed system where you can't install software without risk. If the program you are looking for is included in your distribution, you usually won't experience problems.

But if the program you want is not, you can run into serious trouble: if the program comes in binary form, unless it is statically linked, it will have been compiled against a specific set of libraries, and only if your distribution provides those libraries will you be able to run it. This is one of the reasons most Unix programs are distributed as source code: the end user must compile and install them.
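
You can see the problem for yourself with any downloaded binary (someprogram is just a placeholder name here): list the shared libraries it expects and, when one is missing, you are back to the classic build-from-source ritual.

ldd ./someprogram                  # prints "libfoo.so.3 => not found" for any missing library
tar xzf someprogram-1.2.tar.gz     # so you download the sources instead...
cd someprogram-1.2
./configure && make                # ...and hope all the -dev packages are available
sudo make install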


Package dependencies are just pain

One day you are browsing package descriptions in your software manager, you find a program that seems interesting, you decide to install it and try it, and all you get is:

Building dependency tree
Reading state information... Done
You might want to run 'apt-get -f install' to correct these:
The following packages have unmet dependencies:
 vlc : Depends: vlc-nox (= 1.1.3-1squeeze6) but 1:2.0.1-0.6 is to be installed
       Recommends: vlc-plugin-notify (= 1.1.3-1squeeze6) but 1:2.0.1-0.6 is to be installed
       Recommends: vlc-plugin-pulse (= 1.1.3-1squeeze6) but 1:2.0.1-0.6 is to be installed
 vlc-nox : Depends: libavformat53 (>= 5:0.10.2) but it is not going to be installed

Nice, isn't it? The average Joe does not know how to deal with this, and a quick Google search will give him hundreds of ways to solve it. It is not uncommon to play with dpkg / apt-get, and especially aptitude, and end up with the whole Xorg system removed. I must admit I have spent a lot of time fixing dependencies to resolve conflicts like this, and that is unacceptable in the year 2012.
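
For reference, these are the commands people typically end up typing at that point; whether they repair the system or offer to remove half of it depends on the exact conflict, so read carefully what apt proposes before answering yes:

sudo apt-get update
sudo apt-get -f install            # ask apt to repair the broken dependencies
sudo apt-get install vlc vlc-nox   # or request the conflicting packages explicitly
sudo aptitude install vlc          # aptitude proposes alternative resolutions to choose from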

To be honest, this problem is more common in rolling or semi-rolling distributions, like Debian Testing. But on other distros, like Linux Mint, you can find plenty of other funny problems too, like the same file being owned by two different packages (try installing mint-info-xfce and mint-meta-xfce at the same time), or weird circular dependencies that are very hard to solve: A depends on B, but B is not going to be installed; B depends on A, but A is not going to be installed.


apt-get / aptitude is not so smart

One of the things I tested on Linux is using DVB-T dongles as SDR receivers. On Linux, the RTL-SDR is used with GNU Radio, and the GNU Radio flowgraphs available out there need features from the latest GNU Radio version, which is not the one available in any distribution, so the only way is to compile GNU Radio from source.

To compile GNU Radio you need a lot of libraries, almost all of them available in the distribution, so I installed them using apt-get, compiled GNU Radio and everything worked well.

But during one of the classic dependency problems, there was a point where apt-get showed me a huge list of installed packages that were no longer needed and encouraged me to run apt-get autoremove to purge them. Well, in that list were libboost, python-numpy, libfftw3 and many others I had installed manually with apt to compile and run GNU Radio. Of course, running apt-get autoremove broke my GNU Radio installation. This is just one of the many problems you can run into when installing programs or versions not available in your distribution.
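
In hindsight, the way to avoid this (assuming a reasonably recent apt; the package names below are just the Debian ones I remember needing) is to tell the package manager that those libraries were installed on purpose, so autoremove leaves them alone:

sudo apt-mark manual libboost-all-dev python-numpy libfftw3-dev       # keep them out of autoremove
sudo aptitude unmarkauto libboost-all-dev python-numpy libfftw3-dev   # aptitude equivalent
apt-get -s autoremove              # -s: simulate, to preview what would be removed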


Distributions contain outdated packages. Always.

Because Linux distributions are closed ecosystems frozen at a given time, most programs do not get upgraded except for security reasons. This means that, by default, all programs are outdated, and get more outdated as time goes by. If you see on the author's webpage that a new version of your program has just been released, it can take months to arrive in your distribution... if you are lucky, because most distributions do not upgrade software packages until the next release.

It's quite disturbing to me that the core of any Linux system, the Linux kernel, is systematically outdated in every distribution I tested. Right now kernel 3.7 is available, yet Debian Stable uses a 2.6 kernel, Debian Testing a 3.2 kernel, Mint 14 a 3.5 kernel, and so on...
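
You can check this on your own machine; the commands below compare the kernel you are running with the newest one your repositories will ever offer (linux-image-amd64 is the Debian metapackage name, other distributions use different names):

uname -r                              # the kernel actually running
apt-cache policy linux-image-amd64    # the newest kernel the distribution will give you
apt-cache policy vlc                  # same story for any application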

It is also worth mentioning that some programs are modified from the author's version to suit the dependencies or other particularities of the given distribution. For example, in Linux Mint the available version of VLC is 2.0.4-0ubuntu1. The ubuntu tag means it was modified to suit some aspects of the Ubuntu distribution, so VLC 2.0.4-0ubuntu1 is not exactly the VLC 2.0.4 you can find on VLC's webpage. This process can add new bugs, bugs that do not exist in the authors' original version.

Another problem with tweaking programs is that each distribution sets the defaults to different values, which can lead to a bad first experience if you are not already familiar with the program, because each group of developers chooses a default configuration that usually fits their taste but maybe not yours.

That's the reason why, for some programs or tasks, some distributions are more recommended than others.


Installing different graphical environments at the same time is a no-no

If you want to use Linux, the first thing you must choose is a distribution. Once you have chosen one, you must choose its flavour: Gnome, Gnome Shell, Unity, Mate, Cinnamon, KDE, XFCE, LXDE, etc, etc, etc... You may think of choosing one and then installing the others to try and see which one best suits your needs. That's ok in theory, but in practice it will give you a lot of problems.

For example, your menu will become a complete mess and you will end up with several applications to configure the same things (like mouse behaviour or power saving options) and, of course, with different and contradictory settings in each one!


Upgrading is not reliable

While I was playing with Linux Mint, two new releases came out: Linux Mint 13 and Linux Mint 14. Upgrading from 12 to 13, or from 13 to 14, was not an easy task. I dealt with a lot of dependency problems, and deep knowledge of dpkg, apt-get and aptitude is really needed. In the end, some parts of the system didn't get upgraded, leaving it in a mixed upgraded / deprecated state.

In fact, the recommended way to upgrade from one release to the next is to reinstall the whole operating system. No, it is not a joke! Another unacceptable thing in the year 2012.
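
For the record, the in-place upgrade I attempted looks roughly like this (Mint 13 is “maya” and Mint 14 is “nadia”; this assumes the repository lines live in /etc/apt/sources.list, and it is a sketch of the procedure, not a recommendation):

sudo sed -i 's/maya/nadia/g' /etc/apt/sources.list   # point the sources at the new release
sudo apt-get update
sudo apt-get dist-upgrade          # hours of dependency questions start here
sudo apt-get -f install            # and the cleanup afterwards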


General slowness

In general, Linux feels slower than Windows running on the very same computer. Firefox and Chrome run slower on Linux than their Windows versions (remember: on the very same computer). JavaScript-intensive pages also run slower on Linux (for example, Gmail is quite slow for me using Chrome on Linux), not to mention modern web features like WebGL...

Flash Player (YouTube) is almost a joke on Linux: very high CPU usage, very low frame rate, and even on high-end systems you can see frames being drawn on the screen (especially on scene changes). Other programs like VLC or LibreOffice also feel slower and less responsive than their Windows versions (remember: on the very same computer). I suspect all these problems are related to the quality of the Linux graphics drivers.


Lack of applications

Linux lacks applications. If you are a casual user and your use of a computer is limited to browsing the web, listening to mp3 music, watching some movies and a few more things, Linux will be enough for you. If you are a technical user, you will suffer from the lack of applications.

For example, I'm a ham radio operator, so I'm also interested in electronics. There are very few ham radio programs on Linux (none of them AAA+++ titles) and most of them are severely outdated, almost abandonware. The same goes, for example, for amateur astronomy (another of my hobbies).

The Linux solution for this is to use Wine. Wine adds a compatibility / translation layer that allows Windows programs to run natively on Linux. It works pretty well with simple programs (like Spectran, Argo or MMANA-GAL), but many other programs need several tweaks to run under Wine, or don't run at all.
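
For a simple program the Wine routine is short (the installer file name below is made up; use whatever you downloaded):

sudo apt-get install wine
winecfg                            # first run creates ~/.wine and lets you choose a Windows version
wine setup_someprogram.exe         # run the Windows installer as usual
# the installed program ends up under ~/.wine/drive_c/ and is started the same way: wine program.exe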

In this way you can have your Linux computer and run most of your Windows programs. But what's the point of using Linux to run Windows programs badly?


It's all in the little details

But the most annoying experiences don't come from programs or utilities. They come from Linux's little details.

For example, with XFCE, Pidgin (and other programs) continuously steal focus, so you end up typing your passwords or other sensitive data to your friends. There are several settings to prevent focus stealing: in Pidgin (inside a plugin!), in the window manager and in the XFCE settings. Which one works? Nobody knows. You finally end up trying all possible combinations until it just works... or close enough. No application should steal focus. Ever. Under any circumstance. From this point of view, the Windows version of Pidgin works much better.

Another little annoyance is that there are two ways to copy & paste data or text between applications. You have the modern Ctrl-C / Ctrl-V approach, but the original X selection is still around (and is pasted using the middle mouse button or the wheel). A lot of times I select text with the mouse (because it is faster than selecting it and typing Ctrl-C) and then try to paste it with Ctrl-V (because it is faster than pressing the wheel down without rolling it), and of course it does not work.
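
The two buffers really are separate, which you can see (or manually work around) with xclip, assuming it is installed:

xclip -selection primary -o        # what the mouse selection currently holds
xclip -selection clipboard -o      # what Ctrl-C put into the clipboard
xclip -selection primary -o | xclip -selection clipboard -i   # copy the mouse selection into the Ctrl-V clipboard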

I have also had problems with removable devices, like USB pendrives. Deleting files on a pendrive does not actually remove them: they are moved to a hidden .Trash directory. But emptying the trash does not actually empty it, so the USB device's free space shrinks until it reaches zero. I always end up opening a terminal to delete the .Trash directory by hand and return the drive to a truly empty state.
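
What I end up doing is more or less this (the mount point is an example, and on my systems the trash folder is named .Trash-1000, i.e. .Trash- followed by your user id, so look before you delete):

df -h /media/PENDRIVE              # confirm the "missing" free space
ls -a /media/PENDRIVE              # find the hidden trash directory
rm -rf /media/PENDRIVE/.Trash-1000 # remove it for real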

I also had problems reading files from a Compact Flash card from a Nikon Coolpix 5700 camera. Linux refused to mount the filesystem, complaining about an incorrect number of FAT entries: 252. The card, which holds a simple FAT16 filesystem, reads perfectly on Windows.
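
I never found a clean fix for this one; the obvious thing to try is a read-only filesystem check, assuming the card shows up as /dev/sdb1 (finding the right device name is its own problem, see below):

sudo dosfsck -n -v /dev/sdb1       # -n: check only, never write anything to the card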

Is there a way to know which device node your new, unformatted and unpartitioned storage device got, without digging through the syslog?
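
The least bad answers I know of are, again, terminal commands:

lsblk                              # lists block devices, sizes and mount points
dmesg | tail                       # the kernel prints the device name right after you plug it in
sudo fdisk -l                      # shows every disk and its partitions (or lack of them)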

Linux fonts are ugly, very ugly, and look fuzzy and blurred, causing eye strain and headaches. The problem comes from that thing called antialiasing. Antialiasing is a technique to make your screen blurry and produce headaches. The first thing I do on every operating system I install is disable antialiasing (even on Windows, where it is called ClearType).

On Linux, antialiasing can be disabled in your desktop environment settings, but you will soon discover that some programs (like Firefox and Chrome) still use antialiased font rendering. To solve this you must edit the fontconfig configuration in /etc/fonts/conf.d by hand. Interestingly, on my laptop, which uses a 262,000-colour TFT panel with only 6 bits per colour, antialiasing is not an issue. But on the desktop's TFT panel, with true 8 bits per colour, it is a nightmare. Installing the Microsoft fonts (ttf-mscorefonts-installer) and configuring the system and the browsers to use them really improves the overall Linux appearance and experience.
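
For reference, this is the kind of fontconfig rule that finally turns antialiasing off for everything, browsers included; it can go into ~/.fonts.conf or a file under /etc/fonts/conf.d, although exactly which file fontconfig honours varies between distributions:

<?xml version="1.0"?>
<!DOCTYPE fontconfig SYSTEM "fonts.dtd">
<fontconfig>
  <!-- disable antialiasing for every font -->
  <match target="font">
    <edit name="antialias" mode="assign"><bool>false</bool></edit>
  </match>
</fontconfig>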

Thunar is the file manager of the XFCE environment. When one or more files (or directories) are selected, they can be deleted using the Delete key on your keyboard, with no question or confirmation. On a laptop it is very common to press the wrong key by accident, so you can delete your entire home directory without any confirmation or warning. I also tried Nautilus and PCManFM, but they all have similar problems. I don't feel safe modifying my filesystem with these file managers.

XFCE desktop icons are labeled by default with black letters. This is ok if your background is bright, but I like dark blue backgrounds, and black letters on a dark blue background are difficult to read. After spending a weekend trying to find where to change the desktop icons' font colour, it turned out it is defined in the file ~/.gtkrc.xfce4 or ~/.gtkrc-xfce (depending on your distribution), which must be edited by hand with the hexadecimal codes of the new colour you want. Another very basic setting that has to be configured by editing files by hand in the year 2012.
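
The snippet itself looks roughly like this (this is the commonly posted recipe for xfdesktop; the exact gtkrc file it belongs in, and whether the label-alpha property is honoured, depends on your XFCE version):

style "xfdesktop-icon-view" {
    XfdesktopIconView::label-alpha = 0    # remove the dark box behind the label
    fg[NORMAL]   = "#ffffff"              # icon label colour
    fg[SELECTED] = "#ffffff"              # label colour when the icon is selected
}
widget_class "*XfdesktopIconView*" style "xfdesktop-icon-view"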

LXDE's default desktop icon size is way too large. I spent two days looking for the icon size setting. It is in the preferences of PCManFM (LXDE's file manager). Pretty intuitive, isn't it?


The reality

The truth is Linux hasn't changed too much since the last time I used it as a desktop machine (or workstation, as it was called then). The same problems Linux had more than 10 years ago still exist today. But something went wrong in all these years. The graphical interface moved from simple window managers, like fvwm, to desktop environments, like Gnome or KDE.

Then these lost their way, becoming more a demonstration of their respective programmers' skills than useful desktop environments for users. This explains the rise of LXDE and XFCE: desktop environments anchored in the 90s, still hugely popular in the year 2012 just because there is nothing better. Clearly something went wrong.

As a user, I don't feel a strong urge to migrate to Linux right now. Why? Most of the programs I could use on Linux exist on Windows (and most of them work better on Windows), plus there are tons of specialized programs I would have to try to run under Wine, with the corresponding tweaking. Add the annoyances described above and the result is a less comfortable working environment. Let's face it: Windows today is good enough.

As an amateur programmer, the Linux world does not seduce me. Too many incompatible distributions, too many incompatible graphical environments, too many incompatible toolkits (GTK+, Qt, Tk, wxWidgets, (Open)Motif, Lesstif, plain X, etc.), too many incompatible sound libraries (OSS, SDL, ALSA, aRts, ESD, libao, etc.), and, worst of all, very poor forward and backward compatibility. This is not an attractive platform for a new project. Surprisingly, the most full-featured platform for programming is the least attractive one for new projects.


What does Linux need to be a real alternative?

Interestingly, from the user's point of view, Linux distributions are closed environments that run more like a cathedral, while Windows is an open environment that runs like a bazaar. Linux needs to become more like Windows (but not exactly Windows: we already have Windows and it is good enough).

Only one distribution. Please, no more desktop distributions. Too many distributions for too few users. Get all the manpower focused on a single project. Stop reinventing the wheel over and over again, wasting manpower. Of course we need specialized distributions (for PPC machines, ARM, Amigas, embedded systems, etc.), but please, only one desktop distribution.

Only one desktop environment. Again, too many desktop environments. Linux needs only one simple and powerful desktop environment. Look at OS X or Windows: only one desktop environment, very simple but very powerful, built with productivity in mind, with a fully integrated and powerful file manager and the right amount of eye candy. That's all!

A distribution must care only about the system, the libraries and the desktop environment. No distribution can ever keep millions and millions of applications (let's be optimistic) under control. It is impossible. Leave that task to the software developers. Distributions should only care about providing a base system, with all the required libraries up to date, able to run the newest and the oldest software without any problem. Maybe a few utilities, like text editors or a basic web browser, but little more: let users get the software from the developers' web pages and make sure they can install it on the system easily without breaking anything.

A common package / installer. There must be a single way to package a Linux program and then install it everywhere, in binary form or as source code. Call it deb or rpm or tar.gz or whatever you want, but only one.

Trust the software authors. If the developers of my favourite program decide a new version is ready to be released, I must have a way to install that version directly, without having to wait for the distribution's approval, a process that can take months. PPAs are a good step in this direction, but dependencies make a PPA work only for one distribution.

Rock solid backward and forward compatibility. Give programmers a (very) long term stable platform to run their programs on. Give hardware developers a long term stable platform to run their drivers on. Do not break interfaces, do not break libraries, do not break the API or the ABI. Make backward and forward compatibility rule number one. Don't feel bad about having 25 versions of the same library on the hard disk: today's hard disks are measured in terabytes!

No more reinstalls. Desktop Linux must be able to run for years (or even decades) with seamless and painless upgrades. Nobody wants to reinstall the system (and all their programs and data) every 6 months. Just look at Windows here: I had XP installed on my desktop computer for almost 10 years and it went nicely through all the Service Packs and Windows Updates without a single problem.

No more silly dependencies: flat dependencies. If I want to install a simple program like VLC on Ubuntu I must run "sudo apt-get install vlc vlc-plugin-pulse mozilla-plugin-vlc" (these instructions come from http://www.videolan.org/vlc/download-ubuntu.html). This is silly. It is also quite silly that the same program (VLC) is split into so many packages: browser-plugin-vlc, libvlc-dev, libvlc5, libvlccore-dev, libvlccore5, mozilla-plugin-vlc, phonon-backend-vlc, phonon-backend-vlc-dbg, remuco-vlc, vlc, vlc-data, vlc-dbg, vlc-nox, vlc-plugin-fluidsynth, vlc-plugin-jack, vlc-plugin-notify, vlc-plugin-pulse, vlc-plugin-sdl, vlc-plugin-svg and vlc-plugin-zvbi. This is insane. Why? All these packages currently have version 2.0.4-0ubuntu1 in Ubuntu, and when a new version of VLC is released all twenty of them will be replaced at once, so the split is just nonsense. Just one package with no dependencies, like on Windows: one package, all options, no dependencies.


Why will this never happen?

This will never happen simply because it has never happened. Linux was born 20 years ago. It has had enough time to become the leading OS in the world, but it is still stuck at a constant desktop share of around 1%. Why? There are several reasons:

No defined direction: nobody sets goals, so there are no milestones to reach. Many FOSS programmers do not realize that some programs are done, and that further "improving" them by adding features nobody asked for only frustrates users. Gnome and KDE are good examples, although it seems KDE is finding its way back.

Linux seems to be a toy for its programmers: like children with their toys, FOSS programmers usually do not accept suggestions that go against their own ideas or their own usage. Also, the FOSS programmer's way to settle a dispute is to fork the project. This is how we have ended up with hundreds of distributions, a dozen package formats, dozens of desktop environments and/or window managers, dozens of programs for exactly the same purpose... Yes, of course, it is all about freedom of choice. Just look where freedom of choice has led Linux: to being almost irrelevant as an operating system for the masses and just a curiosity for programmers.

There is no leader: the Linux kernel has a leader, Linus Torvalds. Linus sets the direction according to his own ideas and goals. As far as I know the kernel has not been forked by unhappy programmers, so I guess he is doing his job right. Apple had its leader: Steve Jobs. Does the Linux desktop need a leader? The answer is probably yes.

For many FOSS programmers, Linux is ok as it is, so why change? Other FOSS programmers do not want to see closed-source software running on Linux. Linux will not survive as a desktop operating system without closed / commercial software. The same happened with hardware. Do you remember why the PC became the king of personal computers instead of the Commodores, Amigas, Ataris, Macs or many other computers of those years? Because many programmers wrote nice programs (free and commercial) for it. The same goes for Android. Just face it.

An incredibly large amount of manpower is wasted reinventing the wheel over and over again. Every week, every month, every year. This is complete nonsense. Just imagine what could be done if all that manpower pushed in the same direction.

I suspect Linux will not be significant on the desktop until some company takes the Linux kernel and starts building a (new) system on top of it, just like Google did with Android. Of course, that new system will not be called Linux: today, after 20 years, the word “Linux” carries many bad connotations for people (hard to use, only for geeks, etc.). It will be something else, but not Linux.


The future for Desktop users

I don't know what the future holds for desktop users. Windows 8 sales have not been good, and any serious user just hates the new interface. Some people say Microsoft has failed and is declining. Windows 7's end of life is scheduled for the year 2020, so there is still time to work with the desktop.

But most new computers sold with Windows 8 will have UEFI Secure Boot, which can prevent any unsigned operating system from booting, including Windows 7 / Vista / XP and of course Linux. Unless manufacturers take matters into their own hands, most new computers will be able to run Windows 8 exclusively. Very bad news for desktop users.

I like Linux Mint's move with MATE and Cinnamon, and I think they are headed in the right direction, but they have a long way to go, because the base system is still the problematic Linux with all the annoyances I described in this post. Sadly, I don't expect any serious change from the FOSS community, so I guess Linux in ten years will be exactly as it was ten years ago, unless something changes it.

And maybe something is changing it right now. Valve Software, a game company, stated that Windows 8 is a catastrophe for games and decided to port its game platform to Linux. Although I'm not a gamer, I understand the role games play in a platform, so this can be a new opportunity to bring users to Linux; but it can also be a way to show new users how annoying Linux can be, so I'm not sure whether this will help Linux get onto the desktop right now.

While searching for solutions to the problems I found, I ran into some interesting pages about the status of Linux on the desktop, and this really surprised me: in the past you couldn't complain that Linux was bad without getting into trouble. But now I found relevant Linux personalities questioning Linux's trajectory and its failure on the desktop. The first step to solving a problem is admitting you have one.

For me, one of the most interesting pages I found is Major Linux Problems on the Desktop, a complete technical description of all the current desktop Linux problems. Read the inline links and the user comments: they are very interesting. More interesting articles here, here, here, here, here, here, here, here and here, and experiences from users giving up here, here, here, here, here and here.

It is clear there is a problem with Linux, but it is not clear it will be, or even can be, solved, simply because Linux is made of very different projects, managed by different people with no common goals. The only thing clear to me now is that I'm concerned about the future of the desktop: Windows wants to kill it and Linux is not ready to take over.

7 comments:

  1. Hello Miguel,

    I agree with you. I have used Linux exclusively since 1997 and have run into all the issues you mentioned. If you remove the X server and the window manager from the "equation", then Linux excels in terms of price/quality.

  2. That is why Linux rocks on servers: they do not need a graphical interface and they are rarely upgraded except for security reasons. But server policies are not applicable to desktops. Desktop computing is a very different beast.

  3. You want something like the upcoming 2014 Ubuntu - one desktop system across phone/tablet/PC with one graphics server (Mir).

    On the other hand, I do not want to use Unity, Gnome Shell or KDE4, and I would not use Linux if such a desktop were the only option. Some sane amount of choice is the essence of Linux and open source. Reinventing the wheel (or doing it because "we have to be the creators") isn't that good, and the "new" desktops divided the community and pushed a lot of users to XFCE or LXDE :)

    But there are still a lot of issues with the modern Linux desktop. They won't be solved quickly, as it seems that reinventing desktops / display servers is more important to the key distro and software makers ;)

    Replies
    1. I don't think so. I really *HATE* the Unity interface ;-)

      A desktop has very different needs than a tablet or a phone, so it is impossible to use tablet interfaces on desktop computers. Canonical learned it the hard way. Microsoft also learned it the hard way.

      I know the classic desktop will survive on Linux. My concern is whether Linux will be up to the task. As of May 2013, it is not.

  4. As a Linux user since the year 2000, I have enjoyed your post here very much. I agree with all your comments. I think that I use Linux only because I am a masochist. The lack of software is, for me, the most important drawback. I use Debian Stable and recently changed to the new GNOME desktop. I was completely lost; I spent more than a day trying to figure out how to do the same things I did with the old GNOME. I bought a VNWA3 network analyzer that ended up working under VMware. Too much pain...

  5. I agree with many of the points raised.

    The one and only real issue that keeps me from using Linux as my main system is:
    ******font rendering

    I (resp. my eyes) cannot stand how fonts are drawn in X (resulting in headaches, aching eyes, getting tired very fast, constantly trying to refocus etc.)
    First of all, I need to disable any kind of anti-aliasing (and this means: subpixel rendering, but also (!) includes mere grayscale anti-aliasing).
    What I then get, however, are jaggy and thin fonts that are hard on my eyes, too - though to a lesser degree than any kind of libfreetype (or whatever is to blame...) font smoothing (with or without subpixel rendering).

    By contrast, in windows (XP, but also in 7), I avoid any kind of Cleartype rendering (= subpixel rendering).
    But this does not mean that I avoid antialiasing altogether!!
    I can easily use the older technique called "standard font smoothing" in Windows (even in windows 7), which is mere grayscale anti-aliasing.
    As I just mentioned, Linux has that, too - but in contrast to Windows, it is just not sharp enough, always a slight amount of too much blur (and yes, I tried virtually everything - including the Infinality patches and all kinds of fontconfig settings).

    So when you write that you disable antialiasing on all the systems you use (and you explicitly mentioned ClearType), are you sure you disabled *any* kind of antialiasing in Windows, or do you (maybe?) use grayscale antialiasing instead of fully unsmoothed rendering?

    Anyway, this is the only problem for me with Linux (but it is a show-stopper)!
    And I do have a long history of experience with Linux (in various forms and shapes).

    -kli-

    Replies
    1. On Windows systems, I disable ClearType and font smoothing. This is the only way to get a proper display, although with this configuration some web fonts are broken.

      On Linux I can get very decent results if I install the Microsoft core fonts, set the system to use them, and disable font smoothing completely (KDE). This works for most applications, but there are still some GTK applications that use smoothed/antialiased fonts, which is very annoying... and of course, web fonts are still broken.
