Categories: Computing, Operating Systems, Research

The conflicting, dual definitions of the purpose of an operating system

I was posed this question in my studies. An operating system is said to have two conflicting definitions of purpose:

  1. It must present the user with a virtual machine, a user-friendly interface that isolates them from the low-level hardware
  2. It must manage the limited resources of the hardware system as efficiently as possible

Firstly, I tend to agree that the two conflict in nature. If we consider the operating system as a whole and then consider the goal of efficiently managing resources, it does not take long to realise that an operating system is not entirely geared towards efficiency. On the surface, the limited resources of the computer are often spent on things like graphical effects and enhancements such as transparency and animation, and the resources required to perform these seemingly meaningless tasks can be very taxing on the system hardware.

Looking at the question again, specifically the phrase “manage in the most efficient way the (always) limited resource of the computer system” (University of Liverpool, 2010), I would say that the operating system does manage the resources of the computer quite efficiently. Regardless of the task at hand, the computer operates quite smoothly while multi-tasking; a glance at Task Manager in Windows 7 shows just how many tasks are running at any one time, and the computer still responds seemingly effortlessly. This does, of course, depend on the hardware specification your computer is running, such as RAM and CPU speed, but if you meet the minimum requirements, performance is generally as expected.
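
As a rough illustration of the point about Task Manager, the small Python sketch below counts and lists the processes the operating system is currently juggling. It assumes the third-party psutil package rather than anything the OS itself provides, so treat it as a sketch, not part of the argument above.

```python
import psutil  # third-party package: pip install psutil

# Count the processes the operating system is currently managing,
# roughly what Task Manager shows under its Processes tab.
processes = list(psutil.process_iter(['pid', 'name']))
print(f"{len(processes)} processes currently running")

# Show a handful of them by PID and name.
for proc in processes[:10]:
    print(proc.info['pid'], proc.info['name'])
```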

My conclusion is that the two goals work together quite effectively: in this specific case, Windows 7 as an operating system is very user friendly while still managing the computer's resources efficiently and quickly (unlike its predecessor, Vista, which did not manage resources as well). The concept of a process is absolutely vital to the success of both managing the hardware efficiently and providing a user-friendly environment. A process is defined as a dynamic activity whose properties change as time progresses (Brookshear, 2009, p. 134). Coupled with multiprogramming, it is the means by which different activities and resources are managed and organised; without it there would be chaos, and I believe computing would be sent back to the days of batch processing single tasks.
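
To make the idea of a process and multiprogramming concrete, here is a minimal Python sketch of two processes scheduled by the operating system; the interleaved output is the multiprogramming effect described above, as opposed to batch processing one task after the other. The task names and step counts are purely illustrative.

```python
import multiprocessing
import time

def activity(name: str) -> None:
    # Each process is a dynamic activity: its state (here, a loop counter)
    # changes as time progresses.
    for step in range(3):
        print(f"{name}: step {step}")
        time.sleep(0.1)

if __name__ == "__main__":
    # Two independent processes scheduled by the OS; their output interleaves
    # rather than running strictly one after the other.
    workers = [multiprocessing.Process(target=activity, args=(name,))
               for name in ("task-A", "task-B")]
    for w in workers:
        w.start()
    for w in workers:
        w.join()
```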

References

Brookshear, J.G. (2009) Computer Science: An Overview. 10th ed. China: Pearson Education Asia Ltd.

Categories
Categories: Computing, Development, Operating Systems

My current “IT” setup: is it performing adequately?

As a desktop operating system I am using Windows 7 Professional (64-bit). The networks I use at both my home office and my work office are a combination of Ethernet/bus and Wi-Fi. It is worth mentioning that all the ‘test’ and ‘live’ environments I work on (aside from some testing done on my PC/laptop itself) are Linux-based systems, specifically Ubuntu and CentOS, and that my immediate colleagues are running Apple Macs.

I believe the systems are performing to the baseline of what is expected of them in some respects and doing exactly what they need to in others. Speed is the major drawback, and permissions across shared drives/folders are more of an inconvenience than a show-stopper.

From an operating system perspective, I feel that Windows is lacking in its command-line shell when compared to its Unix-based counterparts. Installing the server software I need for my work (Apache, PHP, MySQL, SSL) is in parts very easy, but more advanced setups, such as SSL with Apache, are not as easy on a Windows machine as they are on Linux. The pros of Windows, I believe, are the vast availability of software, not being tied to specific hardware (as with Apple), the very straightforward ease of use (especially compared to Linux), and how easy it is to add or remove software without it having to come through repositories made specifically for the OS (as with Apple and Linux).
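
As a small illustration of the kind of sanity check I end up doing on these setups regardless of platform, the Python sketch below opens a TLS connection to a web server and reports the negotiated protocol and cipher. The host and port are assumptions for a local Apache instance with SSL enabled, and certificate verification is relaxed because internal test servers often use self-signed certificates; adjust both to your own environment.

```python
import socket
import ssl

# Hypothetical host/port for a local Apache instance with SSL enabled.
HOST = "localhost"
PORT = 443

# Accept self-signed certificates, which are common on internal test servers.
context = ssl.create_default_context()
context.check_hostname = False
context.verify_mode = ssl.CERT_NONE

with socket.create_connection((HOST, PORT), timeout=5) as sock:
    with context.wrap_socket(sock, server_hostname=HOST) as tls:
        print("TLS version:", tls.version())
        print("Cipher:", tls.cipher())
```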

As far as the network goes, when streaming data, copying large amounts of data, or simply working with a network drive (especially when using SVN across a repository with thousands of files), Wi-Fi falls well short of what I would personally accept as efficient performance. Currently we are running a 100 Mbit/s wired LAN to help alleviate this, but then there is another bottleneck in our network server being quite old, though that does not really matter in this specific situation.

Because we work with multiple programmers, the disadvantage of slower network drive performance is outweighed by not having each developer develop locally and do an SVN commit for every code change they want to check on the test server (our network drives are located on the test server). Working on the network drives lets us edit directly on the test server and see our changes instantly, without committing and updating the test server's code base. Although the time it takes to commit all our changes to live from the network drive is far longer than it should be, I would say the positive still outweighs the negative.

At the moment our internal office network is secured by a basic ADSL router, our Wi-Fi by the router's built-in wireless security, and all of our work on the test and live servers is done over SSH. I believe that for the purposes of our organisation this security is sufficient. While one could regard it as only a “standard” level of security, we do not deal with sensitive user information on our local network. Each computer is equipped with antivirus and firewall software that updates daily. I personally believe that overdoing security, while it does lower your risk, can be overkill if you are not dealing with mission-critical data (such as banking information).

If I were to modify just one element, I would give more responsibility to our test server: at the moment it serves as the DHCP server, while DNS and routing/gateway duties are handled by the router. Centralising DNS on the test server as well would remove the need to alter our hosts files when working remotely to point at the test server's external IP instead of its internal LAN IP. If DNS were handled by the test server, the IP address assignment could be changed at the source rather than on each user's computer.
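
As a simple illustration of the difference centralised DNS would make, the Python sketch below resolves an internal hostname the same way any workstation would. The name testserver.local is hypothetical, not our actual configuration; the point is that with DNS on the test server every machine gets the right answer from the resolver instead of from a locally edited hosts file.

```python
import socket

# Hypothetical internal hostname; with DNS centralised on the test server,
# every machine resolves this name the same way, with no hosts-file edits.
TEST_SERVER_NAME = "testserver.local"

try:
    ip = socket.gethostbyname(TEST_SERVER_NAME)
    print(f"{TEST_SERVER_NAME} resolves to {ip}")
except socket.gaierror:
    print(f"{TEST_SERVER_NAME} does not resolve; "
          "a DNS record or hosts-file entry is missing")
```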