
My current “IT” setup: is it performing adequately?

As a desktop operating system I am using Windows 7 Professional (64-bit). The networks I use at both my home office and work office are a combination of wired Ethernet and Wi-Fi. It is worth mentioning that all of the ‘test’ and ‘live’ environments I work on (aside from some testing done on my own PC/laptop) are Linux-based systems, specifically Ubuntu and CentOS, and that my immediate colleagues run Apple Macs.

I believe the systems are performing to the baseline of expectations in some respects, and doing exactly what is needed in others. Speed is the major drawback, and permissions across shared drives/folders are more of an inconvenience than a show-stopper.

From an operating system perspective, I do feel that Windows is lacking in its command-line shell when compared to its Unix-based counterparts. Installing the server software I rely on (Apache, PHP, MySQL, SSL) is very easy in parts, but more advanced setups, such as SSL with Apache, are not as easy on a Windows machine as they are on Linux. The pros of Windows, I believe, are the vast availability of software, not being tied to specific hardware (as with Apple), and its very straightforward ease of use (especially when compared to Linux), as well as how easy it is to add/remove software without it having to be provided through repositories made specifically for the OS (as with Apple and Linux).
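To illustrate the gap, on Ubuntu an SSL-enabled Apache site can be brought up with a handful of stock commands, whereas on Windows the same result typically means sourcing OpenSSL binaries separately and hand-editing httpd.conf. Below is a minimal sketch of the Ubuntu side; the domain, paths and certificate names are placeholders rather than our actual configuration:

    # Enable mod_ssl and generate a self-signed certificate
    sudo a2enmod ssl
    sudo openssl req -x509 -nodes -days 365 -newkey rsa:2048 \
        -keyout /etc/ssl/private/example.key \
        -out /etc/ssl/certs/example.crt

    # /etc/apache2/sites-available/example-ssl (virtual host definition)
    <VirtualHost *:443>
        ServerName example.local
        DocumentRoot /var/www/example
        SSLEngine on
        SSLCertificateFile    /etc/ssl/certs/example.crt
        SSLCertificateKeyFile /etc/ssl/private/example.key
    </VirtualHost>

    # Activate the site and reload Apache
    sudo a2ensite example-ssl
    sudo service apache2 reload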

As far as the network goes, when streaming data, copying large amounts of data, or simply working on a network drive (especially when using SVN across a repository with thousands of files), Wi-Fi falls well short of what I would personally accept as efficient performance. We currently run a 100 Mbit wired LAN to help alleviate this, though our rather old network server then becomes the next bottleneck; that does not really matter in this specific situation, however.
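To put rough numbers on it (idealised figures, which real links never reach): 100 Mbit/s works out at about 12.5 MB/s, so copying a 1 GB working copy takes at least 80 seconds or so on the wire, while an 802.11g Wi-Fi link managing a realistic 20 Mbit/s (about 2.5 MB/s) would need well over five minutes, before SVN’s per-file overhead across thousands of files is even counted.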

Because we work with multiple programmers, the slowed performance of the network drives is outweighed by what the alternative would cost: each developer developing locally and doing an SVN commit for every code change just to check it on the test server. Our network drives are located on the test server itself, which allows us to edit directly on it and see our changes instantly without committing and updating the test server’s code base. Admittedly, the time it takes to commit all our changes to live from the network drive is far longer than it should be, but I would say the positive still outweighs the negative.
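To make the trade-off concrete, the two workflows look roughly like this (the paths, hostnames and repository layout are illustrative, not our real setup):

    # Workflow A: edit directly on the network drive mounted from the test server;
    # changes appear immediately, with no commit/update round trip
    vim /mnt/testserver/project/index.php    # refresh the browser to see the change

    # Workflow B: each developer works locally and commits every change,
    # then the test server's working copy must be updated before checking it
    vim ~/project/index.php
    svn commit -m "tweak header" ~/project
    ssh user@testserver 'svn update /var/www/project'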

At the moment our internal office network is secured by a basic ADSL router, our Wi-Fi by the router’s built-in wireless security, and all of our work on the test and live servers is done over SSH. I do believe the security is sufficient for the purposes of our organisation. While one could regard it as only a “standard” level of security, we do not deal with vital user information on our local network, and each computer is equipped with antivirus and firewall software that updates daily. I personally believe that overdoing security, while it does lower your risk, can be overkill if you are not dealing with mission-critical data (such as banking information).

If I were to modify just one element, I would give our test server more responsibility: at the moment it serves as the DHCP server, while DNS and routing/gateway duties sit with the router. Centralising DNS on the test server as well would remove the need to alter our hosts files when working remotely in order to point at the test server’s external IP instead of its internal LAN IP. If DNS were handled by the test server, the IP address designation could be altered at the source instead of on each user’s computer.
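For context, the manual workaround we use today looks like the first snippet below; the second shows how a single entry on the test server could replace it, assuming a combined DHCP/DNS daemon such as dnsmasq were the tool of choice (a natural fit, since the test server already handles DHCP). The hostnames and addresses are made-up placeholders:

    # Today: every developer edits their own hosts file by hand
    # (/etc/hosts on Linux/Mac, C:\Windows\System32\drivers\etc\hosts on Windows)
    192.168.1.10    test.example.local    # in the office (internal LAN IP)
    # 203.0.113.45  test.example.local    # working remotely, swapped in by hand

    # Proposed: one line in /etc/dnsmasq.conf on the test server answers
    # the name for every machine on the LAN, changed in a single place
    address=/test.example.local/192.168.1.10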