
A brief description of Linux and its components

There are many different operating systems, and one of them is Linux. What is it, and where is it used? How is it organized? How does it differ from the Windows that most people are familiar with? All of these questions will be answered in this article.

What is Linux

Linux is an operating system that gives you a wide choice of software to install. It lets you choose between several types of desktop environments and about a dozen command-line shells. The command line, by the way, runs in a program called the terminal and plays a very important role. Because the system was designed from the start to run many programs at once in isolation from one another, it is relatively resistant to failures. And because, by default, no program can start without the user's knowledge, it is very difficult (although possible) to catch a virus written for Linux: when such machines do get infected, it is almost always because a user launched a suspicious application. Some versions of the system can even run directly from an optical disc or a USB drive.

Development history

Do you know where the name comes from? The system is named after its creator, Linus Torvalds. He presented his creation to the public in 1994 (although development had begun as early as 1991). Because the operating system was open source and anyone could modify it, he gained followers around the world. Various companies producing paid distributions began to take an interest in it, and at the same time communities of developers emerged that create and distribute their own builds on a volunteer basis. As of 2016, there are about a dozen popular Linux operating systems. What does this state of affairs tell us? That there is strong competition, mutual assistance (however strange that combination may sound), and diversity. Besides the popular versions, less common ones are also widely used, often aimed at a particular range of tasks. And if you want Linux in your own language, downloading a localized version is not a problem; for the rest, there are language packs that will translate at least part of the interface.

Linux and Windows: what are the differences

At first glance, the two systems do not look all that different. But that impression is wrong. There are differences, but to see them you need to look deeper. The focus will be on Linux, but to give you an idea of what is what, comparisons will be made with Windows. We will consider:

  1. File system.
  2. Graphic shell.
  3. System configuration.
  4. Scope of application.

And in conclusion, a little attention will be paid to distributions.

File system

First of all, it should be noted that Linux has no drive letters in the usual Windows sense. Does that sound difficult? Not at all! Logical partitions and physical disks are still there, but they are represented as directories: a disk is mounted into some folder, and everything created on it appears inside that folder. You can only work with the files it contains. The most important directory is the root, denoted by /. User files, for example, are usually stored under /home/username/. That said, you can change the location of almost anything if you wish. Is such a structure really hard to navigate?
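To make the single-tree idea concrete, here is a small Python sketch (the path is hypothetical; `username` stands for any real account name) showing that every Linux path hangs off the one root, /:

```python
from pathlib import PurePosixPath

# A typical (hypothetical) location of a user's file on Linux.
p = PurePosixPath("/home/username/documents/report.txt")

print(p.root)    # '/' -- the single root of the whole directory tree
print(p.parts)   # ('/', 'home', 'username', 'documents', 'report.txt')
print(p.parent)  # /home/username/documents
```

No drive letter appears anywhere: whatever disk the file physically lives on, its name starts at /.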

Graphical shell

At first, after Windows, navigation may seem inconvenient. But that is only true of the terminal. There are also graphical shells that provide the familiar point-and-click interface. The terminal, by the way, is used either on servers or by advanced users with a good memory; a graphical shell suits everyone else, including those who simply do not want to memorize exact directory paths. Linux can appear in a wide variety of forms: several desktop environments are available, and the visual themes for them number in the hundreds.

Configuration

On Windows, configuration data lives in the system registry, a special database. It exists so that programs launched in the operating system are configured correctly when they load and begin work. If the registry is damaged, often only a complete reinstallation will help, although keeping everything in one place does have its advantages. Linux took a different path: each program owns a separate configuration file (sometimes several), and these are plain text files that can be viewed or edited with any text editor. Splitting the settings across files has its benefits: if one of them is damaged, only part of your settings is lost, and in the worst case you reinstall a single program. When a user switches to a different computer, he does not have to start from scratch; he can simply copy the necessary files between machines (directly or via removable media). And what if the operating system itself is damaged? Here Linux has an important advantage: you reinstall the system, but the configuration files survive the process and are not lost. There is one small drawback, though: each program has its own configuration-file format, and you need to understand it well before editing.
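As a sketch of what such a plain-text configuration file might look like, here is a hypothetical INI-style file (the section and keys are invented for illustration), written out and read back with Python's standard configparser; any ordinary text editor could edit the same file:

```python
import configparser
import os
import tempfile

# A hypothetical per-program configuration file in INI style.
cfg_text = """\
[editor]
theme = dark
tab_width = 4
"""

# Write it out as an ordinary text file, the way a Linux program would.
path = os.path.join(tempfile.mkdtemp(), "app.conf")
with open(path, "w") as f:
    f.write(cfg_text)

# Read it back: no registry, just a file.
cfg = configparser.ConfigParser()
cfg.read(path)
print(cfg["editor"]["theme"])              # dark
print(cfg.getint("editor", "tab_width"))   # 4
```

Copying this one file to another machine carries the program's settings with it, which is exactly the migration scenario described above.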

Applications

We have talked about what Linux is and how it differs from the Windows system familiar to most users. Now let us look at where it is used. Unfortunately, it will not find application everywhere. As a server, or as a home or office workstation, Linux is close to ideal. The catch is that relatively few games have been ported to or created for this operating system, and some widely used commercial software, such as Adobe's graphics packages and complex engineering programs (AutoCAD, MATLAB, and the like), has no native versions. In most cases, though, this can be worked around with emulation. And not everyone needs a computer for those specific tasks. So here is a short list of the system's capabilities; if it covers what you need, you can try it in practice without being disappointed:

  1. The system can create office documents and spreadsheets, prepare various documentation, and process text.
  2. It can view and edit images and photos.
  3. You can listen to music and watch videos.
  4. You can play games on the system, though usually fairly simple ones.
  5. You can communicate over the Internet (in this respect Linux is head and shoulders above Windows).
  6. It is convenient to program on.
  7. You can browse the Internet.
  8. It is convenient for remotely managing operating systems (any of them).
  9. Various messaging programs and e-mail clients are available.
  10. File sharing is convenient.
  11. The operating system is free.
  12. And finally, viruses here are more myth than reality; at least the author has never met one.

About distributions

Already want Linux in your own language? Then a few words about distributions are in order. On the one hand, it is hard to say that they are different operating systems; on the other, no one can prove otherwise. You can choose your preferred language and regional settings, which makes Linux friendlier: a Russian version, a French one, a Brazilian one, the choice is yours. You can even use the system to study foreign languages seriously. But back to distributions. They differ mainly in how programs are laid out in the file system. It is not that one distribution is better than another; it all depends on the tasks in front of you. You can install a version meant purely for programming, with only minimal, text-mode Internet support; that option suits those who cannot concentrate and are constantly distracted. There are others that let you tune the system for the best performance your machine can deliver. The choice is wide, and only you can settle on one.

Because the Linux source code is freely distributed and publicly available, a large number of independent developers joined the development of the system from the very beginning. As a result, Linux today is a modern, stable, and fast-evolving system that absorbs technological innovations almost instantly. It has all the features inherent in modern full-featured UNIX-like operating systems. Here is a short list of these capabilities.

Real multitasking

All processes are independent; none of them should interfere with other tasks. To achieve this, the kernel implements time-sharing of the central processor, allocating a time slice to each process in turn. This is quite different from the cooperative multitasking of Windows 3.x, where a process must itself "yield" the processor to other processes (and can greatly delay their execution).

Multi-User Access

Linux is not only a multitasking OS; it also supports many users working simultaneously. Linux can provide all of the system's resources to users working with the host through various remote terminals.

Swapping RAM to disk

Swapping RAM to disk allows you to work with a limited amount of physical RAM: the contents of some parts (pages) of RAM are written to a dedicated area on the hard disk, which is treated as additional RAM. This somewhat slows things down, but makes it possible to run programs that need more RAM than the computer actually has.

Memory paging

Linux organizes system memory in 4 KB pages. If RAM is nearly exhausted, the OS looks for memory pages that have not been used for a long time and moves them from memory to the hard disk. If any of those pages is needed again, Linux restores it from disk. Some older systems instead swapped out the entire memory image of an idle application (that is, ALL of an application's memory pages were written to disk when memory ran out), which is less efficient.

The Linux kernel supports on-demand page allocation, in which only the necessary part of the code of the executable program is in RAM, and the parts that are not currently in use remain on disk.
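The page size can be checked from any program through the POSIX sysconf interface; a one-liner in Python (the typical value on x86 Linux is 4096 bytes, though other architectures may use larger pages):

```python
import os

# Ask the OS for its memory page size via POSIX sysconf.
page_size = os.sysconf("SC_PAGE_SIZE")
print(page_size)  # typically 4096 on x86 Linux
```

Everything the kernel swaps, caches, and demand-loads is moved in multiples of this unit.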

Sharing executable programs

If it is necessary to run several copies of an application at the same time (either one user launches several identical tasks, or different users launch the same task), then only one copy of the executable code of this application is loaded into memory, which is used by all simultaneously executing identical tasks.

Shared Libraries

Libraries are collections of routines used by many programs. There are a number of standard libraries that several processes use at the same time. In older systems, a copy of such a library was linked into every executable file, so running them simultaneously wasted memory. Newer systems (Linux in particular) support dynamically linked shared libraries alongside traditional static linking, which can markedly reduce the size and memory footprint of individual applications.
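As a sketch on a Linux/POSIX system: Python's ctypes can open the C library that is already shared by virtually every process on the machine and call a routine from it (getpid here), illustrating that one in-memory copy of the library serves them all:

```python
import ctypes
import os

# CDLL(None) returns a handle to the symbols already loaded into this
# process -- on Linux that includes the shared C library, libc.so.6.
libc = ctypes.CDLL(None)

# getpid() from the shared libc agrees with Python's own os.getpid().
print(libc.getpid())
print(os.getpid())
```

The Python interpreter did not bundle its own copy of getpid; it simply calls into the single shared libc mapped into its address space.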

Dynamic disk caching

Disk caching is the use of part of the RAM to store frequently used data from the disk, which significantly speeds up access to frequently used programs and tasks. MS-DOS users work with SmartDrive, which reserves fixed areas of system memory for disk caching. Linux uses a more dynamic caching system: the memory reserved for the cache increases when the memory is not in use, and decreases if the system or user process needs more memory.

Close POSIX 1003.1 compliance, with partial support for System V and BSD features

POSIX 1003.1 (Portable Operating System Interface) specifies a standard interface for Unix-like systems, described as a set of C-language routines, and it is now supported by all new operating systems; Microsoft Windows NT, too, provides a POSIX 1003.1 subsystem. Linux follows POSIX 1003.1 closely and additionally supports a number of System V and BSD features to increase compatibility.

For interprocess communication, Linux provides the classic IPC (InterProcess Communication) mechanisms: message queues for exchanging messages between processes, semaphores, and shared memory.

Ability to run executable files of other operating systems

Linux is far from the first operating system ever written. A great deal of software, some of it very good and very useful, was developed for earlier systems, including DOS, Windows 95, FreeBSD, and OS/2. To run such programs under Linux, emulators for DOS, Windows 3.1, and Windows 95 have been developed. Moreover, VMware has developed a "virtual machine" system, a computer emulator that can run any operating system, and other companies have similar products. Linux can also run binaries from other Intel-based Unix platforms that conform to the iBCS2 (Intel Binary Compatibility Standard 2).

Support for various file system formats

Linux supports a large number of file system formats, including DOS and OS/2 file systems, as well as modern journaling file systems. At the same time, Linux's own file system, called the Second Extended File System (ext2fs), allows for efficient use of disk space.

Networking

Linux can be integrated into any local network. All Unix services are supported, including the Network File System (NFS), remote access (telnet, rlogin), TCP/IP networking, SLIP and PPP dial-up access, and more. A Linux machine can also act as a server or client for other kinds of networks; in particular, file sharing and remote printing work with Macintosh, NetWare, and Windows networks.

Work on different hardware platforms

While Linux was originally developed for Intel 386/486-based PCs, it now runs on all Intel microprocessors from the 386 up to multiprocessor Pentium III systems (the Pentium 4 had some early issues which, according to reports on the Internet, were caused by errors in the processor's implementation). Linux runs just as well on Intel-compatible processors from other vendors; there are reports on the Internet that Linux works even better on AMD Athlon and Duron processors than on Intel's. In addition, versions have been developed for other processor families: ARM, DEC Alpha, SUN SPARC, M68000 (Atari and Amiga), MIPS, PowerPC, and others (note that only the version for IBM-compatible computers is considered in this book).

The number of private users of the various Linux versions cannot be estimated accurately: distributions of this system, unlike fully commercial programs, can be obtained completely free of charge from friends or non-commercial distributors, as well as downloaded from the FTP servers of the very companies that successfully sell Linux. A scheme completely unthinkable in terms of ordinary capitalism works, and it suits all participants.

A tale is quickly told, but a deed is slowly done. The Linux operating system became known to the general public no more than two years ago, though those who have long followed news from the world of high technology, and who talk now and then with representatives of the "crazy programmer" breed (geeks or nerds, in English), have known the word Linux since about 1995. Not surprisingly, Linux's current success on many fronts, from the commercial to the "ideological," seems to many to be amazingly, incredibly fast. Type the word Windows into the query box on altavista.com and you get 8,670,139 links; the word Linux brings up 2,989,363. Eight months ago, the ratio was roughly 6,500,000 to 900,000. Pretty remarkable, isn't it? So where did this Linux come from, and why is it successful? Who pulled the strings? What, exactly, are we applauding? Let's go back thirty years and take a running start; it will be easier that way. The whole story began long before the world heard of Linus Torvalds, the creator of perhaps the most successful programming project of the last decade. In 1971, a young programmer and researcher named Richard Stallman began working at the famed Massachusetts Institute of Technology. In those days, the era of "big computers," software was often developed by loose associations of programmers and freely passed on to other users who needed it; even large firms often did this. One such firm was AT&T, or rather its Bell Labs. It was barred from commercial activity in the computer field, and so the developers of the Unix operating system, Ken Thompson and Dennis Ritchie, mailed magnetic tapes with the Unix source code from their workplace for no more than the cost of the media.
By 1983 the situation had changed: the era of personal computers had arrived, commercial programs and operating systems (in particular, DOS from Microsoft) had begun their victorious march around the world, and the rust of self-interest had crept into the world of "big" machines and "serious" programming. And so Stallman, with sadness in his heart, founded the GNU project (www.gnu.org), whose goal was to bring back the good old days: GNU is a UNIX-compatible system comprising a set of "free" (or "open") software.

It is worth dwelling on the fundamental concept of "free" software in more detail. The GNU manifesto devotes a lot of space to the difference between software that is free as in freedom and software that is merely free of charge; in some languages, including Russian, the distinction is easier to draw, since, unlike in English, the two concepts are not covered by the single word "free." Borrowing or buying "free" software, you can:

copy as much as you like, distribute it as widely as you like;

modify or improve its source code (a program distributed under the GNU "public license" always comes with its source code, the most closely guarded and never-disclosed part of commercial software);

finally, you can freely dispose of the modified version - even give it away for free, even ask for a billion for it.

But there is one thing the user of such software may not do under any circumstances: when distributing it further, he cannot hide the program's source code and declare himself its "owner," thereby halting the program's free improvement and development. Especially for such programs, the GNU project introduced the concept of "copyleft" (as opposed to "copyright," under which the creator of a product retains almost all rights to it under any circumstances, even when distributing it completely free of charge). Clearly, the problem of piracy simply does not exist for "free" software.

GNU still thrives today. No less successful is the GPL (General Public License) devised by Stallman, thanks to which Linux, created by Linus Torvalds, has won more than 20 million users over the years of its existence.

By the end of last year, the number of web servers running this operating system exceeded the number of servers on the Windows platform. Linus Torvalds himself moved from Finland to the USA three years ago, to the city of Santa Clara in California, joining the mysterious Transmeta company (its microprocessors are a topic for a separate story). But Linus has not abandoned work on the Linux kernel: new versions come out with enviable regularity, only now, in full accordance with the principles of GNU, he is helped in improving Linux not by tens but by thousands of developers around the world.

Linus will never be Bill Gates. Nor does he aspire to be: neither power nor money attracts him (at least not on that scale). In any case, in the nascent, indeed already born, market for "free" software, it is unlikely that anyone will manage to extract windfall profits in the Microsoft style. Or rather, let's put it this way: windfall profits are still possible, since demand for software only grows, but here they will be distributed a little differently.

So it is time to examine, almost under a microscope (as far as the scope of this publication allows), the business model for producing and distributing "free" software. At first blurry glance it seems completely impossible, or at least unprofitable for everyone involved in production. In fact, there is nothing communist about it at all. The developers' enthusiasm, which at first sight looks like pure altruism, and the consumers' cries of delight, which look like mindless fanaticism, in fact rest on sober economic calculation, as we will now demonstrate. To begin with, it must be stressed once more that there is no isolated "Linux phenomenon" in nature. The success of Linux is just a special case of a fundamental and quite possibly irreversible change in the software industry's business model. After all, Linus was not the only one to give his program away and receive 20 million users in return. Recall the system administrator at a web-server company who simply set up a mailing list for patches to the free Apache server after its creators stopped supporting it. Several years later, that server is still free, it still has no "owner" responsible for its technical support (and never will!), yet it serves more than 60 percent of the planet's websites. Meanwhile, companies that have spent hundreds of millions of dollars advertising and promoting their commercial web servers struggle to win a market share comparable to that of the "no man's" Apache. There is a whole heap of such examples: almost all the software standards underlying the Internet rest on programs with open source code. Thus the TCP/IP transport protocol, which belongs to no one, long ago defeated the "closed," corporate-owned DECnet, XNS, IPX, and the rest.
Why look far afield: the world's most important computer standards committee consists of anyone and everyone who cares to join its mailing lists, meeting in person (those who can) three times a year.

Just a year and a half ago it was useless to ask the head of a large Western company whether he used "free," open-source software. You would only make him angry: "it's not mature enough," "there's no technical support for it," "it has no commercial value." A Russian manager will say exactly the same thing today (the author of these lines knows this from personal experience). But the Western one no longer does.

The first swallow came in January 1998, when Netscape Communications published the source code of its flagship product, the Navigator browser (www.mozilla.org). Let me remind you that by then Microsoft, which had discovered the commercial potential of the Internet much later than its rival, had begun exerting powerful pressure on Netscape, rapidly squeezing it out of the market. At the time, the decision of Netscape's executives seemed to many a last step dictated by desperation; later it turned out to be the only right one. True, the company itself is now owned by the AOL corporation, but the market share of Netscape's products, both servers and browsers, has remained virtually unchanged and still brings AOL profit.

A few months later, large corporations, pillars of the computer business (Corel, Informix, and Oracle), announced that they were porting their products to the Linux operating system. Immediately afterwards, the mighty IBM included the "open" Apache server in one of its packages. Finally, in September 1998, Intel, Netscape, and a few venture capitalists invested in Red Hat Software, a distributor of Linux distributions (ten months later, Red Hat would become one of the biggest commercial successes of the year). Journalists watching these changes suddenly remembered that most mail on the Internet is sent using the "open" and free sendmail, and that the largest sites on the World Wide Web use the "open" programming languages Perl, Tcl, and Python. The most inquisitive observers reminded readers that the entire Internet was originally created so that a community of independent developers could quickly and conveniently exchange program code. Only the term "open source" itself is relatively young (it was introduced two years ago by the well-known developer and apologist of "open" programs, Eric Raymond); the model of such software's presence on the market is by no means new.

You can step away from the Internet (which is, after all, a special environment) and recall how things stood in the operating-system market a quarter of a century ago. Back then IBM (much like Microsoft and some other companies now, clinging with all their might to "closed" software with tightly classified source code) failed to notice in time that the age of unique custom-built hardware had passed irrevocably, and that the public was now more interested in "mass" software and "mass" hardware. In the heyday of IBM's power, the barrier to entry into the business was unusually high, but with the arrival of the personal-computer platform and the spread of the first "open" operating system (Unix) the rules changed dramatically, the threshold dropped, and green newcomers (one Bill Gates, for instance) began to make the weather in the industry, while the "closed" operating systems of the past that had reigned supreme on the market (like Apollo's) sank irretrievably into history. Development always moves in a spiral. Microsoft, having grown into a huge empire, behaved as imprudently as IBM did all those years ago: it raised a high barrier to entry into the business, and thereby not only protected itself (temporarily) from competitors but also deprived itself of freedom of maneuver. And then came another turn of the spiral: through the efforts of Raymond, Torvalds, and the leaders of Netscape, the concept of "open" software came back into use after a long break, and it will become the gravedigger of many of today's titans, once again greatly lowering the threshold of "entry" into the business. Many of the titans understand this and are going over to the other side. The fans of "open" software are not trying to beat Microsoft at its own game: instead, they are changing the very nature of the game.
In the words of Tim O'Reilly, one of the most prominent ideologues of the open-source movement, "our real mission is not to replace Microsoft, which dominates there, on desktop systems, but rather to build a business model that would resemble the slogan 'Intel Inside,' but for the next generation of computer applications."

What is this self-confidence based on? On the fact that in many situations producing "open," and therefore often free, software turns out to be more profitable than producing "closed" software with source code guarded like the apple of one's eye. Let us first give a theoretical example and then reinforce its educational effect with several practical ones. Say you have been hired by a firm that needs a specific program, for example one to handle web-based payments. Whether the code is "open" or "closed" does not change the work of actually writing it. Keeping it secret may make sense if you intend to resell the program, or if you fear the employer's competitors will use it. The first option is not very likely (90 percent of all programs are written by firms for internal use), and the second is worth examining more closely: do the minuses (a competitor starting to use the "open" program) outweigh the pluses (gaining almost unlimited, long-term support from the programming community)? Many will object that "opening" a program means throwing away the money spent on creating it; but this is a fallacy, since that money would have had to be spent anyway. In his book "The Magic Cauldron," the same Eric Raymond gives a real example of this kind: two programmers were hired by the well-known Cisco company to write a distributed print server. The job was done, and then they realized they did not intend to stay at Cisco for life. Meanwhile, any program, especially a specialized one, needs maintenance and support, and Cisco certainly did not expect the print server to stop working a month after their departure. So they decided to persuade their manager... to publish the program's source code on the Internet.
He overcame his convictions and agreed; in the end, Cisco weathered the departure of the two programmers without loss, for now a number of companies and individuals used the print server and could be asked for help at any time.

And here is another classic example, from a completely different corner of the software industry: games. The game in question is a classic too: Doom, from Id Software. When it came out in 1993, it made economic sense for Id to keep the code closed. First, it was far ahead of its time, and Id could not afford to hand such ultra-modern weaponry to competitors. Second, the program was not a so-called critical application; no buyer's business depended on it. Third, games are not overly complex in principle, and the Id team could easily deal with bugs as they were discovered. But time passed, competitors began to catch up, many games very similar to Doom appeared, and its market share declined significantly. Doom had to be constantly updated, with network support bolted on, scripts written, and so on. It ended with Id releasing the Doom source code to the public in 1997. The firm's programmers gained time to work on new games, and Id itself gained the support of thousands of independent developers, a sharp increase in its user base, and the ability to earn money on the secondary market without much strain (selling collections of scripts and the like). Nowadays, incidentally, Doom is included in many standard Linux distributions.

Finally, the third classic example: the distributors of the Linux OS itself, above all the Red Hat company, which holds almost 90 percent of the US Linux market. Unlike Microsoft, this company does not sell software, does not "sell bits." It distributes Linux (bundled with hundreds of other programs) for free, charging solely for its brand, installation assistance, and the promise of technical support, and only from those willing to pay; after all, you can get exactly the same thing absolutely free. Usually, a few days before the official release of the next Red Hat version, it can already be downloaded from public FTP servers (note: absolutely legally!). Red Hat has nothing against this; otherwise it would immediately lose the support of independent developers. In return, the company gets much more: explosive growth of the Linux market and millions of customers who need its support services and professional advice. Red Hat went public (that is, floated its shares on the stock exchange) less than a year ago; on the first day of trading the share price skyrocketed, and in just eight months Red Hat's shareholders became richer by 1,900 percent (the company's total value reached 17 billion dollars). This company is clearly making money from its backbone role in the market. And not only Red Hat: as soon as the owners of the Salon.com website announced that they would supply news for the RedHat.com website, their share price doubled within a few hours. More recently came word that the hitherto unprofitable Linux server vendor VA Linux set an all-time record in US stock-market history, rising 711 percent on its first day of trading.
Finally, the Andover.Net company, which owns the cult resources of Linuxoids Slashdot.Org and Freshmeat.Net, increased its value tenfold in half a year of trading on the stock exchange, until it was literally bought out at the moment these lines were written with a huge overpayment (almost for a billion dollars) the same VALinux, having received for its money several sites whose total income does not exceed 3 million dollars a year. Yes, what is it doing?

The usual story: investors are buying not so much the shares as the hope of further explosive market growth, and hope commands a price. Besides, they may well turn out to be right not only now (giant fortunes are already being made speculating in Linux stocks) but also in the long term. Judge for yourself: back in 1952 the wonderful writer Robert Heinlein observed that there are four ways to extrapolate an exponential curve. Take, for example, a hypothetical operating system codenamed Linux, which has been doubling its market share every six months for 10 years. What would four hypothetical analysts say about it?

The conservative will predict that the status quo will continue for a number of years, and then a gradual decline to, say, 1 percent of the market.

A bold analyst would suggest a further increase, quickly fading away, and a stop at the 5 percent level.

A very, very brave analyst will decide that the system's market share will grow linearly: 5 percent in ten years, 10 percent in twenty, and so on.

And finally, the only mathematically correct way to continue the exponent is to continue it. In this case, 2.5 percent of the market turns into 100 in two and a half years.

There is some truth in every joke - Linux's market share has crossed 5 percent and is approaching the coveted 10. Usually at this point fellow travelers start jumping on the lucky one's bandwagon, and that is the process we have been watching for the past few months. When the Hollywood company Digital Domain won the contract to develop the special effects for the film Titanic, it became clear that its 350 SGI workstations (from the famous Silicon Graphics) were not nearly enough to fulfill the order. Digital Domain bought 160 more machines on the Alpha platform running Linux, which in the end specialists rated higher than the SGI machines traditionally used for this kind of work - and the results, in this case, could be judged not only by specialists but by half the world's population. SGI itself recently announced full support for the Linux OS. Corporate members of the Linux International association include such respected players in the computer market as Sun, IBM Software Solutions and Compaq. The largest manufacturers of computer equipment (for example, Hewlett-Packard) install Linux on their servers. The two leaders in DBMS (database management systems) production, Oracle and Informix, have announced software support for Linux. The giant Corel not only actively sells the WordPerfect package for Linux but also distributes the operating system itself. In short, one has only to pronounce the word Linux on the NYSE trading floor for everyone present to hold their breath in anticipation of a miracle.

One could describe the virtues of Linux at great length, but no description is possible without explaining what stands behind them: free software, the Linux development model, and Linux security.

What is Linux

Linux is a multi-user, networked, Unix-like operating system with the X Window System network windowing environment. Linux supports open-system standards and Internet network protocols and is compatible with Unix, DOS and MS Windows systems. All components of the system, including the source code, are distributed under a license permitting free copying and installation for an unlimited number of users. Linux is widespread on Intel platforms and is gaining ground on a number of others (DEC AXP, Power Macintosh, etc.).

Linux development is led by Linus Torvalds of the University of Helsinki, together with an uncountable team of thousands of Internet users, employees of research centers, foundations, universities, and so on.

Free Software

The Linux kernel, the main components of the system, and most user applications are free software. They can be run on any number of computers and passed on, for money or free of charge, without restriction, and anyone can obtain their source code and make any changes to it.

The freedom of the software has ensured its wide adoption and the interest of thousands of developers. The main programs for Linux are released under the GNU General Public License, which not only guarantees freedom but also protects it, allowing further distribution of the programs only under the same license. Thus the Linux kernel code, the compilers, the glibc libraries, and the KDE and GNOME user environments cannot be used to build closed-source applications. This is the fundamental difference between Linux and the free BSD operating systems (FreeBSD, NetBSD, OpenBSD), fragments of which found their way into the Windows family and even became the basis of MacOS X. Linux includes many BSD developments, but its compilers and system libraries are developed within the GNU project.

Linux development

Unlike Windows, MacOS and the commercial UNIX-like systems, Linux has no geographic development center. There is no company that owns this OS, nor even a single coordinating center. Programs for Linux are the result of thousands of projects. Some of these projects are centralized, some are concentrated in firms, but most bring together programmers from all over the world who know each other only through correspondence. Anyone can create a project of their own or join an existing one and, given success, the results of the work become known to millions of users. Users take part in testing free software and communicate directly with the developers, which allows bugs to be found and fixed quickly and new features implemented promptly.

It is this flexible and dynamic development system, impossible for closed-source projects, that accounts for the exceptional cost-effectiveness of Linux. The low cost of free development, well-established testing and distribution mechanisms, the involvement of people from different countries with different views of the problems, and the protection of the code by the GPL license have all contributed to the success of free software.

Of course, such high development efficiency could not fail to interest large companies, which began to open up their own projects. This is how Mozilla (Netscape, AOL), OpenOffice.org (Sun), the free clone of InterBase (Borland), and SAP DB (SAP) appeared. IBM contributed to porting Linux to its mainframes.

On the other hand, open source code significantly reduces the cost of developing closed systems for Linux and makes the final solution cheaper for the user. That is why Linux has become a platform frequently recommended for products such as Oracle, DB2, Informix, Sybase, SAP R/3 and Domino.

Security

The Linux OS inherited from UNIX its reliability and excellent protection system. The file access control system spares it many of the viruses that plague the Windows world. However, no program is free of errors, and Linux is no exception. Thanks to the openness of the source code, any specialist can audit a program without non-disclosure agreements and without having to work within the walls of the company that hired him. That is why security flaws are detected especially efficiently and fixed quickly. The mechanism for reporting and correcting security errors was created by the Linux community; it involves specialists from development companies as well as independent programmers.
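The access control model mentioned here is easy to see from any shell: every file carries read/write/execute permission bits for its owner, its group, and everyone else, and a program running under one account simply cannot open files another account has closed off. A minimal sketch (the temporary file and its name are arbitrary):

```shell
# Create a private file: owner may read/write, group may read, others get nothing.
tmpdir=$(mktemp -d)
secret="$tmpdir/notes.txt"
echo "confidential" > "$secret"
chmod 640 "$secret"          # 6=rw- for owner, 4=r-- for group, 0=--- for others

# ls -l shows the resulting permission string, e.g. "-rw-r-----"
perms=$(ls -l "$secret" | cut -c1-10)
echo "$perms"

rm -rf "$tmpdir"
```

A virus running as an ordinary user would hit exactly these permission checks when trying to modify system files owned by root.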

Capabilities provided by the Linux OS

Linux OS:

  • makes it possible to have a modern operating system free of charge and legally;
  • is fast;
  • works reliably and stably, without freezes;
  • is not susceptible to viruses;
  • lets you use the capabilities of a modern PC in full, removing the limits that DOS and MS Windows place on the use of machine memory and processor resources;
  • manages multitasking and priorities effectively: background tasks (a long computation, sending e-mail over a modem, formatting a floppy disk, etc.) do not interfere with interactive work;
  • makes it easy to integrate the computer into local and global networks, including the Internet, and works with networks based on Novell and MS Windows;
  • can run applications in the executable formats of other OSes - various versions of Unix, DOS and MS Windows;
  • gives access to the huge number of software packages accumulated in the Unix world and distributed freely together with their source code;
  • provides a rich set of tools for developing application programs of any complexity, including client-server systems and object-oriented programs with multi-window text or graphical interfaces, targeting Linux as well as other OSes;
  • gives the user, and especially the developer, a wonderful learning environment in the form of rich documentation and the source code of all components, including the OS kernel;
  • lets anyone who wishes try their hand at development, communicate and collaborate over the Internet with any of the Linux OS developers, and contribute to the system as a co-author.
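The point about background tasks not interfering with interactive work is directly visible in the shell: a trailing `&` starts a job in the background, and the prompt returns immediately. A minimal sketch (the `sleep` stands in for a long computation):

```shell
# Start a "long computation" in the background; the shell returns immediately.
sleep 2 &
bgpid=$!

# Interactive work continues while the background job runs.
echo "still interactive while PID $bgpid runs in the background"

# Collect the job's exit status when it finishes.
wait "$bgpid"
status=$?
echo "background job finished with status $status"
```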

Who might need the Linux OS, and why

Users of many different categories may be interested in Linux, and for many different reasons. An exhaustive list is impossible, but here are some examples:

  • Linux is a full 32-bit (64-bit on the DEC AXP platform) operating system that uses the computer to its full potential; Linux easily turns an IBM PC into a genuine workstation;
  • the price gain is very large, since in addition to savings on hardware, software under Linux comes with a free license that permits unlimited copying of the system;
  • the savings on software alone can run from thousands to tens of thousands of dollars; for many users in Russia a free license is the only way to equip themselves legally with a complete set of software;
  • Linux is of great interest to consumers and developers of application systems that need a multi-platform compiler, a powerful multi-window debugger, and emulators and compatibility systems - all of which Linux has;
  • to scientists and technical writers Linux offers:
    • the LyX document processor, which uses the concept of logical markup and produces beautiful, well-structured documents;
    • the TeX and Scribus publishing systems.

The main difference between Linux and the operating systems of the Windows family is a qualitatively different organization of the file system. In Windows the user deals with a disk-based file system: he addresses logical drives C, D, E... and uses file system types such as FAT16, FAT32 and NTFS. In UNIX-family operating systems, including Linux, there are no logical drives. The Linux file system is organized around partitions - parts of the total space of the hard disk - and the data on a partition is reached through the directory of the file system to which that partition is attached; the native file system types are ext2 and ext3, which differ in the level of data-storage safety they provide (ext3 adds journaling). This flexible way of assigning partitions lets the operating system manage security effectively, granting some users access to data while denying it to others.
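The difference is easy to observe on a running system: there are no drive letters, only directories to which partitions are attached. A minimal sketch reading the kernel's mount table (the exact filesystem names will vary from machine to machine):

```shell
# Show the filesystem type and device backing the root directory:
df -T /

# /proc/mounts lists every mount point known to the kernel;
# field 2 is the directory, field 3 the filesystem type.
rootfs=$(awk '$2 == "/" {print $3; exit}' /proc/mounts)
echo "the root directory / is on a $rootfs filesystem"
```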

Modern operating systems. Advantages, qualitative comparison and functional features of the OS

Operating systems of the UNIX family satisfy all the requirements placed on them by VLSI development tools: simultaneous access to the development tools for multiple users; high data-processing speed; a set of text and graphics editors and tools for reading help files and internal CAD documentation; programs for network access to the Internet and other networks; tools for working with CAD remotely; and a wide variety of useful programs and utilities for audio, video and photo material, among much else. In addition, an important component of this family of operating systems is a command interpreter capable of processing the user's command requests and directing the resulting output to the monitor screen, to a file, or to a device.

UNIX development was carried out by computing professionals working in a scientific environment. As UNIX evolved, it gave rise to various operating systems, the best known of which are Linux, Solaris and FreeBSD. Each of these systems then developed independently while retaining all the advantages of UNIX.

The most widely used operating system of the UNIX family is Linux. The platform has a friendly graphical interface that lets the user manage the OS comfortably with a mouse as well as the keyboard. Among the main advantages of Linux are:

1. flexibility;

2. power;

3. stability;

4. multitasking mode;

5. multi-user mode;

6. high level of security;

7. user-friendly graphical interface;

8. a large number of text and graphics programs;

9. a set of client and server applications;

10. the presence of an "open license".

Linux conveniently combines high system performance with the ability to fine-tune the elements of the system to the needs of a particular developer. A properly configured system can run around the clock for long periods without interruption. Multi-user and multitasking modes, together with a high level of OS security, allow developers to use many CAD applications at once without the threat of losing or corrupting their data. High stability, along with the power of the operating system, makes it possible to serve user requests reliably on a mid-range hardware platform.

An equally important advantage of Linux is the presence of a so-called "open license" for most distributions. Although the source code of some software is distributed openly and free of charge, it is protected by public licenses, which deny commercial companies the right to modify it slightly, declare copyright over the changes, and then take control of the software product and sell it as their own. The most popular such license is the GNU General Public License provided by the Free Software Foundation, under which the Linux operating system is distributed. The GNU license reserves copyright to the developer but guarantees free use of the software, provided that the software itself and all additions and changes to it always remain freely distributable.

When working with Linux, the VLSI developer can launch programs both on the main computer at which he sits and on a remote computer, working with it through his own terminal. In Linux one can customize the type, size and color of fonts, the background color of the terminal window, and the form of the command prompt, and define the desired variables and aliases. The developer can monitor processes, change their relative priorities as necessary, and terminate some of them.
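Priority control of the kind described works through the standard `nice` and `renice` utilities: a higher "niceness" means a lower scheduling priority, and an unprivileged user may always lower the priority of his own processes. A minimal sketch (the `sleep` again stands in for a real task):

```shell
# Launch a background task at reduced priority (niceness 10).
nice -n 10 sleep 5 &
pid=$!

# Read its niceness back from the process table.
ni=$(ps -o ni= -p "$pid" | tr -d ' ')
echo "initial niceness: $ni"

# Lower the priority further while the task is running.
renice -n 15 -p "$pid" >/dev/null
ni2=$(ps -o ni= -p "$pid" | tr -d ' ')
echo "niceness after renice: $ni2"

# Remove the process, as the text describes.
kill "$pid" 2>/dev/null
```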

An operating system is a set of programs that manages the computer hardware, organizes work with files (including launching programs and managing their execution), and handles interaction with the user, i.e., interprets the commands the user enters and displays the results of processing them.

Without an operating system a computer cannot function as such at all; it is nothing more than a collection of idle electronic devices with no apparent reason for having been assembled together.

Today the most widely known operating systems for computers are the Microsoft Windows and UNIX families. The former trace their lineage to the MS-DOS operating system that shipped with IBM's first personal computers. The UNIX operating system was developed by a group of Bell Labs employees led by Dennis Ritchie, Ken Thompson and Brian Kernighan in 1969. But these days, when people speak of the UNIX operating system, they usually mean not one specific OS but a whole family of UNIX-like operating systems. The word UNIX itself (in capital letters) became a registered trademark of the AT&T Corporation.

In the late 1970s (of the last century, that is), the staff at the University of California at Berkeley made a number of improvements to the UNIX source code, including support for the TCP/IP protocol family. Their development became known as BSD ("Berkeley Software Distribution"). It was distributed under a license that allowed the product to be modified and improved, and the result to be passed on to third parties (with or without source code), provided that it was indicated which part of the code had been developed at Berkeley.

Operating systems such as UNIX, including BSD, were originally designed for large multi-user computers - mainframes. But personal computer hardware gradually grew in power, and today it surpasses the capabilities of the mainframes for which UNIX was developed in the 1970s. And so, in the early 1990s, a student at the University of Helsinki, Linus Torvalds, began to develop a UNIX-like operating system for IBM-compatible personal computers.

1.1.2 A bit of history

Here is the text of the message that Torvalds sent to the comp.os.minix newsgroup on August 25, 1991:

From: [email protected] (Linus Benedict Torvalds)

Newsgroups: comp.os.minix

Subject: What would you like to see most in minix?

Summary: small poll for my new operating system

Organization: University of Helsinki

hello everybody out there using minix

I'm doing a (free) operating system (just a hobby, won't be big and professional like gnu) for 386(486) AT clones. This has been brewing since april, and is starting to get ready. I'd like any feedback on things people like/dislike in minix, as my OS resembles it somewhat (same physical layout of the file-system (due to practical reasons) among other things).

I've currently ported bash(1.08) and gcc(1.40), and things seem to work.

This implies that I'll get something practical within a few months, and I'd like to know what features most people would want. Any suggestions are welcome, but I won't promise I'll implement them :-)

Linus ( [email protected])

PS. Yes - it's free of any minix code, and it has a multi-threaded fs. It is NOT portable (uses 386 task switching etc), and it probably never will support anything other than AT-harddisks, as that's all I have :-(.

In this message Linus writes that he is working on a (free) operating system for 386 (486) computers and asks anyone interested to report which system components users would like to see first. As the text of the message shows, the bash shell and the gcc compiler were already working for him. They ran under the Minix operating system, which Professor Andrew Tanenbaum had developed as a teaching tool for student programmers. Minix ran on computers with the 286 processor and served as the prototype for Torvalds's new operating system.

The files for the first version of Linux (version 0.01) were released onto the Internet on September 17, 1991. As Torvalds himself writes: "As I already mentioned, 0.01 didn't actually come with any binaries: it was just source code for people interested in what linux looked like. Note the lack of announcement for 0.01: I wasn't too proud of it, so I think I only sent a note to everybody who had shown interest."

Then, on October 5, 1991, version 0.02 was released and was already running. However, a detailed presentation of the history of Linux is beyond the scope of this book, so I will not continue this topic, referring interested readers to [A3.1].

L. Torvalds did not patent or otherwise restrict the distribution of the new OS. From the outset, Linux has been distributed under the terms of the General Public License (GPL) for software developed under the Open Source movement and the GNU Project (see [A3.2]). In Linux slang, this license is sometimes referred to as Copyleft. This license, the Open Source movement, and the GNU project deserve special mention.

In 1984 the American scientist Richard Stallman founded the Free Software Foundation. The purpose of the foundation was to eliminate all prohibitions and restrictions on the distribution, copying, modification and study of software. Until then, commercial companies had carefully guarded the software they developed, protecting it with patents and copyright marks and keeping the source code of programs written in high-level languages (such as C++) in the strictest confidence. Stallman believed that this did great harm to software development, lowered the quality of programs, and left huge numbers of errors in them undetected. Worst of all, it slowed the exchange of ideas in programming and the creation of new software, because every programmer had to write each program over again from scratch instead of borrowing ready-made pieces of source code from existing programs.

The Free Software Foundation launched the GNU project, a project to create free software. The name GNU expands recursively - GNU's Not Unix, i.e., what belongs to the GNU project is not part of Unix (by that time even the word UNIX itself had become a registered trademark, i.e., had ceased to be free). In The GNU Manifesto [A3.3], written in 1985, R. Stallman cites his rejection of individual ownership of software as the main driving force behind the emergence of the FSF and the GNU project.

The fact that software developed by the GNU Project is free does not mean that it is distributed without a license or is legally unprotected. Programs developed as part of the Open Source movement are distributed under the terms of the General Public License (GPL) [A3.2]. Very briefly, the essence of this license is as follows: software distributed under it may be modified, passed on, or sold to other persons in any way, provided that the result of such reworking is also distributed under the copyleft license. This last condition is the most important and defining one. It ensures that the results of free software developers' efforts remain open source and do not become part of any conventionally licensed product. It is also what distinguishes free (libre) software from merely gratis software. In the words of the creators of the FSF, the GPL "makes software free and ensures that it stays free."

Virtually all GPL software is nearly free of charge to users (in most cases one pays only for the CD-ROM or the Internet traffic needed to obtain it). This does not mean that programmers stop getting paid for their work. R. Stallman's main idea is that what should be sold is not software but the programmer's labor as such. Income may come, for example, from maintaining software products, installing and configuring them for deployment on new computers and/or in new environments, teaching, and so on. The reputation an author of free software earns can also be a good reward, later bringing a well-paid job.

As part of the Open Source movement, and of the GNU project in particular, a significant number of programs have been developed, the most famous being the Emacs editor and GCC (GNU C Compiler), to this day the best C compiler. The openness of source code benefits software quality greatly: the best new ideas and solutions spread immediately, while errors are noticed and quickly eliminated. A mechanism of natural selection begins to work - one that is suppressed under the commercial approach to software distribution.

But back to the history of Linux itself. It must be said that Linus Torvalds developed only the kernel of the operating system. This kernel "fell on prepared soil" in the sense that the GNU project had already produced a large number of utilities of all kinds; yet to turn GNU into a full-fledged OS, a kernel was exactly what was missing. Work on a kernel was under way (it was called Hurd), but for various reasons it was delayed. The appearance of Torvalds's work was therefore very timely, and it marked the birth of an open-source operating system.

R. Stallman is of course right when he insists that the Linux operating system should be called GNU/Linux. But it so happened that the name of the kernel became the name of the entire operating system, and we will do the same in this book.

1.1.3 Key Features of Linux OS

Due to the fact that Linux source codes are freely distributed and available to the public, a large number of independent developers joined the development of the system from the very beginning. Thanks to this, Linux is currently the most modern, stable and rapidly developing system, absorbing the latest technological innovations almost instantly. It has all the features that are inherent in modern full-featured operating systems such as UNIX. Here is a short list of these possibilities.

Real multitasking

All processes are independent; none of them may interfere with the others. To achieve this, the kernel implements time-sharing of the central processor, allocating execution time slices to each process in turn. This is quite different from the "cooperative" multitasking used for 16-bit applications in Windows (including under Windows 95), where the process itself must "yield" the processor to other processes (and can greatly delay their execution).

Multi-User Access

Linux is not only a multitasking OS; it also supports simultaneous work by many users, and it can provide all system resources to users working with the host through various remote terminals.

Swapping RAM to disk

Swapping RAM to disk allows work with a limited amount of physical RAM: the contents of some parts (pages) of RAM are written out to a dedicated area of the hard disk, which is treated as additional RAM. This slows the work somewhat, but it lets programs run that need more RAM than the computer actually has.
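On a live system the kernel reports both physical memory and the swap area through /proc/meminfo; a machine with no swap configured simply shows `SwapTotal: 0`. A minimal sketch:

```shell
# The kernel reports RAM and swap sizes (in kB) in /proc/meminfo.
grep -E '^(MemTotal|SwapTotal|SwapFree)' /proc/meminfo

swap_kb=$(awk '/^SwapTotal:/ {print $2}' /proc/meminfo)
echo "swap area configured: ${swap_kb} kB"
```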

Memory paging

Linux organizes system memory in 4 KB pages. When RAM nears exhaustion, the OS looks for memory pages that have not been used for a long time and moves them from memory to the hard disk. If one of these pages is needed again, Linux restores it from disk. Some older Unix systems and some modern platforms (including Microsoft Windows) flush to disk all the RAM contents of a currently idle application (i.e., ALL the memory pages belonging to the application are saved to disk when memory runs out), which is less efficient.
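The page size is not hard-wired into programs; portable code asks the system for it at run time. On most x86 Linux machines the answer is the 4 KB mentioned above:

```shell
# Query the memory page size in bytes (typically 4096 on x86 Linux).
page=$(getconf PAGE_SIZE)
echo "page size: $page bytes"
```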

Loading executable modules "on demand"

The Linux kernel supports on-demand page allocation, in which only the necessary part of the code of the executable program is in RAM, and the parts that are not currently in use remain on disk.

Sharing executable programs

If it is necessary to run several copies of an application at the same time (either one user launches several identical tasks, or different users launch the same task), then only one copy of the executable code of this application is loaded into memory, which is used by all simultaneously executing identical tasks.

Shared Libraries

Libraries are sets of procedures used by programs to process data. A number of standard libraries are used by more than one process at the same time. In older systems such libraries were included in every executable file, and running several of them simultaneously wasted memory. Newer systems (Linux in particular) support working with dynamically and statically linked shared libraries, which reduces the size of individual applications.
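Which shared libraries a given program will load can be listed with `ldd`. A minimal sketch, assuming a typical GNU/Linux system where /bin/ls is dynamically linked:

```shell
# List the shared libraries that /bin/ls will map at run time.
ldd /bin/ls

# Count them; all dynamically linked programs share these library images in memory.
libs=$(ldd /bin/ls | grep -c '=>')
echo "/bin/ls pulls in $libs shared libraries"
```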

Dynamic disk caching

Disk caching is the use of part of the RAM to store frequently used data from the disk, which significantly speeds up access to frequently used programs and data. MS-DOS users work with SmartDrive, which reserves fixed areas of system memory for disk caching. Linux caches more dynamically: the memory given over to the cache grows when memory is otherwise unused, and shrinks when the system or a user process needs more memory.
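How much RAM the kernel is currently devoting to the disk cache can be read from /proc/meminfo; the figure shrinks automatically when applications need the memory. A minimal sketch:

```shell
# Buffers and Cached report RAM currently used for disk caching (in kB).
grep -E '^(Buffers|Cached)' /proc/meminfo

cached_kb=$(awk '/^Cached:/ {print $2}' /proc/meminfo)
echo "the page cache currently holds ${cached_kb} kB"
```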

100% POSIX 1003.1 compliance; partial support for System V and BSD features

POSIX 1003.1 (Portable Operating System Interface) defines a standard interface for Unix systems, which is described by a set of C language routines. Now it is supported by all new operating systems. Microsoft Windows NT also supports POSIX 1003.1. Linux is 100% POSIX compliant. Additionally, some System V and BSD features are supported to increase compatibility.

System V IPC

For interprocess communication Linux supports System V IPC (InterProcess Communication): message exchange between processes, semaphores, and shared memory.
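The System V IPC objects present on a machine can be inspected with the `ipcs` utility (from util-linux, assumed installed here). A minimal sketch:

```shell
# List the three kinds of System V IPC objects.
ipcs -m    # shared memory segments
ipcs -s    # semaphore sets
ipcs -q    # message queues

# Count the shared memory segments currently in use
# (object lines begin with a hexadecimal key).
shm=$(ipcs -m | grep -c '^0x' || true)
echo "$shm shared memory segments in use"
```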

Ability to run executable files of other operating systems

Linux is not the first operating system ever created. For earlier operating systems, including DOS, Windows 95, FreeBSD and OS/2, a great deal of software has been written, some of it very good and useful. Emulators of DOS, Windows 3.1 and Windows 95 have been developed to run such programs under Linux. Moreover, VMware has developed a "virtual machine" system - a computer emulator on which any operating system can be run - and similar products exist from other companies. Linux is also capable of running binaries from other Intel-based Unix platforms that conform to the iBCS2 (Intel Binary Compatibility Specification 2) standard.

Support for various file system formats

Linux supports a large number of file system formats, including DOS and OS/2 file systems, as well as modern journaling file systems. At the same time, Linux's own file system, called the Second Extended File System (ext2fs), allows for efficient use of disk space.
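The set of filesystem types a particular kernel can mount is listed in /proc/filesystems ("nodev" marks virtual filesystems that need no block device). A minimal sketch:

```shell
# Every filesystem type this kernel can mount:
cat /proc/filesystems

fscount=$(wc -l < /proc/filesystems)
echo "this kernel knows $fscount filesystem types"
```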

Networking

Linux can be integrated into any local network. All Unix services are supported, including the Network File System (NFS), remote access (telnet, rlogin), TCP/IP networking, and dial-up access over SLIP and PPP. Linux can also serve as a server or client machine for networks of other kinds; in particular, file sharing and remote printing work with Macintosh, NetWare and Windows.

Work on different hardware platforms

While Linux was originally developed for Intel 386/486-based PCs, it now runs on all versions of Intel microprocessors, from the 386 up to multiprocessor Pentium III systems (the Pentium 4 had some issues which, according to reports on the Internet, were caused by errors in the processor's implementation). Linux also runs successfully on various Intel clones from other manufacturers; there are reports on the Internet that Linux works even better on AMD Athlon and Duron processors than on Intel's. In addition, versions have been developed for other types of processors - ARM, DEC Alpha, SUN Sparc, M68000 (Atari and Amiga), MIPS, PowerPC and others (note that this book considers only the version for IBM-compatible computers).

1.2. Linux distributions

There are 4 main parts in any operating system: kernel, file structure, user command interpreter and utilities. The kernel is the main, defining part of the OS that manages the hardware and the execution of programs. The file structure is a system for storing files on storage devices. A command interpreter or shell is a program that organizes user interaction with a computer. And, finally, utilities are just separate programs that, generally speaking, do not fundamentally differ from other programs launched by the user, except for their main purpose - they perform utility functions.
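All four parts can be glimpsed on a running system. A minimal sketch (the output varies by machine, and $SHELL may be empty in a non-login session):

```shell
uname -sr                  # the kernel and its version
echo "shell: $SHELL"       # the user's command interpreter
df -T / | tail -n 1        # the file structure: the filesystem holding /
type ls                    # a utility, resolved by the shell

kernel=$(uname -s)
```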

As mentioned above, to be precise, the word "Linux" refers only to the kernel. Therefore, when talking about an operating system, it would be more correct to say "an operating system based on the Linux kernel". The Linux kernel is developed under the general supervision of Linus Torvalds and distributed freely (based on the GPL license), like a huge amount of other software, utilities and applications. One of the consequences of the free distribution of software for Linux was that a large number of different firms and companies, as well as simply independent development groups, began to release so-called Linux distributions.

A distribution kit is a set of software that includes all four main components of the OS (the kernel, the file system, the shell and a set of utilities) as well as some collection of application programs. Usually all the software included in a Linux distribution is distributed under the terms of the GPL, so it may seem that anyone can release a distribution, or rather anyone who is not too lazy to assemble a collection of free software. And there is some truth in such a statement. However, the distribution developer must at least create an installation program that will install the OS on a computer that does not yet have one. In addition, it is necessary to resolve the interdependencies and conflicts between different packages (and versions of packages), which, as we will see later, is also a non-trivial task.

However, there are already over a hundred different Linux distributions in the world, and new ones appear all the time. A more or less complete list of them can be found on the http://www.linuxhq.com server, where a brief description is given for each distribution (some localized versions are also mentioned). In addition, there are links to other lists of distributions, so if you wish you can find everything that exists in the world (although all of this is in English, and Russian localizations are barely mentioned there).

A. Fedorchuk in the article [A3.8] made an attempt to classify distributions based on the following criteria:

the structure of the file system;

the installation program;

the software package installer used;

the composition of the utilities and application programs included in the distribution.

Although A. Fedorchuk concludes that the differences between distributions are insignificant and increasingly blurred, it still follows from his article that today at least three groups of distributions can be distinguished, the most typical representatives of which are Red Hat, Slackware and Debian.

By what criterion should you choose a distribution? In my opinion, for our country there are two criteria: the distribution must be Russified, and there must be a development team that supports it. And it is better if this team derives some income from this (or perhaps some other) activity, that is, functions as a commercial firm. Even during the relatively short period that I have been involved with Linux, several distributions whose support teams worked "on a voluntary basis" have left the scene, their developments no longer supported.

In Russia, three teams of developers have recently formed, creating and supporting Russified distributions.

One of the teams was formed at the Institute of Logic (http://www.iplabs.ru). For some time, this team was engaged in the Russification of the Linux distribution Mandrake Russian Edition, and in March 2001 organized the company "ALTLinux" (http://www.altlinux.ru) and released its own distribution kit ALTLinux (which, however, is very similar to Linux Mandrake Russian Edition).

The second team is represented by "ASPLinux" (http://www.asplinux.ru, http://www.asp-linux.com, http://www.asp-linux.com.sg, http://www.asp-linux.co.kr), which also released its own ASPLinux distribution. This team includes L. Kanter and A. Kanevsky, who previously released the well-known Black Cat Linux distribution.

The third team, as far as I can tell, is represented by the St. Petersburg company “Linux Ink.” (http://www.linux-ink.ru), which releases Red Hat Linux Cyrillic Edition.

Of course, there are other Russified distributions. In 2000, the Best Linux distribution (http://bestlinux.net), supported by the Finnish company SOT, and RosLinux appeared. A description of several Russified Linux distributions is given in A. Fedorchuk's book [A1.6]. But, in my opinion, if we talk about choosing a distribution, then today only three deserve attention: Red Hat Linux Cyrillic Edition, Linux Mandrake Russian Edition (and its descendant ALTLinux) and ASPLinux. I can give the following reasons for this choice:

These distributions belong to a family of distributions based on Red Hat Linux, produced by the American company of the same name, and judging by the materials on the Internet, Red Hat is the most widespread distribution in the world.

These distributions are initially Russified.

Each of them has a fairly streamlined installation procedure that automatically recognizes most hardware components, which greatly simplifies installing the system.

It is easy to install (or add) additional software because it comes in RPM packages (a software distribution technology, analogous to setup programs under Windows).

These distributions are supported by established development teams and are constantly updated, so you can be sure that you will be able to work with the latest versions of Linux.
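The RPM technology mentioned above is driven by the rpm utility. A hedged sketch of its basic modes (the package name here is hypothetical, and actually installing or removing packages requires root privileges, so those calls are shown only as comments):

```shell
# Typical rpm invocations (need root and a real package file):
#   rpm -ivh mc-4.5.51-1.i386.rpm   # install, verbose, with progress
#   rpm -qa                         # query all installed packages
#   rpm -e mc                       # erase (uninstall) a package
# RPM file names encode name-version-release.arch.rpm; the package
# name is everything before the first "-digit" boundary:
pkg="mc-4.5.51-1.i386.rpm"
echo "package name: ${pkg%%-[0-9]*}"
```

Run as shown, the last line prints "package name: mc".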

A few words about version numbering. A distinction must be made between distribution version numbers and kernel version numbers. When people talk about Linux versions, they usually mean the kernel version (since whether an operating system belongs to the Linux family is determined by whether it uses the Linux kernel). Since Linus Torvalds continues to coordinate kernel development, kernel versions evolve sequentially, rather than branching and multiplying like distributions.

Linux kernel versions are usually denoted by three numbers separated by dots. For example, the Black Cat distribution version 5.2 was built on kernel version 2.0.36, i.e. it was Linux version 2.0.36. Kernel versions with an odd second number are usually not used to build distributions because they are experimental (development) versions. They are distributed mainly so that enthusiasts can test them and identify their shortcomings; naturally, such a version may be unstable. Versions with an even second number are (considered) stable. Of course, you can install any version, but beginners are usually advised to choose a kernel whose second version number is even. And of course, if you install a complete distribution, the choice of kernel is made for you by its developers, but you should be aware of the version numbering if you ever consider updating the kernel.
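The even/odd convention is easy to check mechanically. A small sketch (the version string is the fixed example from the text; substitute "$(uname -r)" to check the kernel you are actually running):

```shell
version="2.0.36"                        # example from the text
minor=$(echo "$version" | cut -d. -f2)  # second number of the version
if [ $((minor % 2)) -eq 0 ]; then
    echo "$version belongs to a stable kernel series"
else
    echo "$version belongs to an experimental (development) series"
fi
```

For "2.0.36" the second number is 0, which is even, so it reports a stable series.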

1.3. Computer Requirements

I've come across mentions that there are special versions of Linux that run even on an 8086 processor with 512 KB of memory, and a specially built version can run from one or two floppy disks without a hard drive.

So, if you have an old computer that does not run any Windows, then you can successfully use it to master Linux and you may be surprised by its capabilities. But such options are not considered in this book.

Because the Linux OS uses the microprocessor's protected mode, at least a 386 processor is required to install this OS. Judging by the literature, any modification is suitable: SX, DX, etc. Further requirements for the hardware of the computer on which Linux is installed are determined by what you want from the system. Table 1.1 below shows how the hardware requirements grow depending on the user's wishes (the numbers given in the table are very approximate; I do not pretend they are the ultimate truth).

Table 1.1. Hardware Requirements

User wishes                                                      | Memory                                             | Hard disk space
Minimum: work in text mode from the shell command line           | 4 MB                                               | 10 MB
Work in text mode via Midnight Commander                         | 4 MB                                               | 40 MB
Launching the X Window GUI                                       | 8 MB (very slow); 16 MB is more or less acceptable |
Working with the X Window GUI (running a window manager)         | 16 MB                                              | 300 MB
Running the KDE integrated graphical environment                 | 32 MB                                              | 500 MB
Each additional large application (GIMP, word processor, database, spreadsheet) | +2 MB                               | +50-100 MB
Working with the StarOffice integrated office suite              | 64 MB                                              | +250 MB

From this table we can conclude that the minimum acceptable configuration for mastering Linux is a computer with a 486 processor, 16 MB of RAM and a 300 MB hard disk. After that you only need to take care of increasing the RAM and hard disk space; extra capacity is never superfluous.
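To compare your own machine against Table 1.1, two standard commands suffice (/proc/meminfo is Linux-specific; df is a universal Unix utility):

```shell
# Total RAM, as reported by the kernel (in kilobytes)
grep MemTotal /proc/meminfo

# Total and free space on the root partition (in 1 KB blocks)
df -k /
```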

Again I will refer to the book by A. Fedorchuk [P1.6], in which a large chapter is devoted to choosing a hardware platform for Linux. In it the author examines in detail how Linux relates to every piece of computer hardware, from the chipset and motherboard to peripherals and uninterruptible power supplies. In my opinion, however, in practice the choice of a computer is determined not by the operating system but, first of all, by the financial capabilities of the owner. And one of the advantages of this OS is its ability to run not only on the latest and "fanciest" models, but also on copies that are already "out of fashion" or "obsolete". After all, so-called "moral obsolescence" arises precisely because new versions of software from the best-known manufacturers force fully functional equipment to be written off as scrap. In this sense, Linux has the great advantage of being able to run even on computers where the only alternative is MS DOS. Of course, in such cases we only get command-line mode, but judging by various sources on the Internet, this does not prevent old computers from being used for various auxiliary tasks, for example as routers.

But the questions of using Linux for these purposes do not fall within the scope of our interest. If we talk about a typical user, then, judging by my experience, if you can work on a computer with Windows 95, and even more so with Windows NT or Windows 2000, then such a computer is quite suitable for running Linux.

1.4. Where to get Linux?

And, to conclude the first chapter, a brief answer to the question posed in the title of this section.

As mentioned, Linux, along with a huge number of application programs, is distributed almost free of charge. This means that a user who does not intend to modify the software or sell it has every right to copy an entire Linux distribution or any part of it from a friend, download it from the Internet, or buy a Linux CD-ROM from merchants in an underpass, without fear of being prosecuted for violating the license requirements (which for some reason are called "agreements") issued by the developer.

Of the three listed options for obtaining the distribution, I would suggest buying it on CD-ROM. It is advisable to buy not in an underground passage (although I bought my first distribution at the local market and did not regret it), but from one of the computer firms or through an online store. This gives you a choice and some guarantees, at least the exchange of a defective disk. Just keep in mind that prices vary widely. There are beautiful boxed sets costing more than 1000 rubles (and setting the price is the seller's right), while the same distribution (perhaps only without a printed installation guide) can be bought for a hundred or two.

I myself have recently been using the services of online stores. I do not indicate a specific address (advertising is now paid), but there are now many of them, so the absence of a specific address here is not an obstacle for those who want to purchase a distribution kit.

Notes:

“As I mentioned, version 0.01 was distributed without binaries: it was just source code, intended for those interested in what Linux looked like. Note that there was no announcement of version 0.01: I was not very proud of it, so I just sent a message to everyone who had shown some interest.”

(comment sent by V. Sinitsyn, Linux Center) From the very beginning, the Linux kernel was distributed under a license that the FSF would not recognize as free at all, since it prohibited commercial distribution. Its text can be found in the archives of early versions of the kernel at ftp.kernel.org (see, for example, ftp://ftp.kernel.org/pub/linux/kernel/Historic/old-versions/RELNOTES-0.01). The license change seems to have taken place in version 0.12 (see ibid., RELNOTES-0.12).

These difficulties have now been overcome, and Linux runs successfully on all processors from Intel.

It can be noted that with each new version of the kernel, the requirements for hardware configuration are increasing. The data in Table 1.1 refer to the Black Cat Linux 5.2 distribution. As my experiments on installing Red Hat Linux 9 in the minimum configuration (see my articles and translations page) show, modern distributions already require at least 600 MB of disk space and 64 MB of memory.
