DATA CENTRE UNDERWATER AND OUT OF THE WAY

Elite Computers

Underwater data centres make a lot of sense. Half of the world’s population lives within a few hours’ travel of an ocean. And while land real estate prices are high, the ocean floor has little competition for space. As long as servicing does not prove to be an issue, a question still under investigation, underwater data centres should prove useful.

Microsoft has unveiled Project Natick, an underwater data centre on trial for four months. This first centre has been positioned about a kilometre off the west coast of the USA. Like other projected underwater centres, it should provide quick and inexpensive data access for those using the cloud. As more and more computers make use of the cloud for data storage, far more data centres will be needed.

The initial Natick project looks promising, but it is still a small-scale experiment. If it proves viable, future commercial data centres will have twenty times the storage capacity. The target lifespan of each centre is about twenty years, though they will be retrieved and upgraded every five years.

One of the major issues with computers and data centres has always been cooling. Power to the machines causes heating that will quickly prove destructive if not compensated for. But underwater data centres should stay naturally cool because of their environment, removing a problem common to other data centre situations. The other major issue is supplying that power. In accordance with environmental principles, these data centres are intended to be self-sufficient, probably using wave and tidal power. The environment 30 feet below the surface, where the Natick data centre presently sits, looks to be quite stable; temperatures should stay consistently cool and tidal forces should be regular.

In the near future, Microsoft hopes to have underwater data storage facilities installed and operational within 90 days of a request.

ELITE COMPUTERS – A Few Windows 10 Features

Computer Repair and Updating

It’s easy to upgrade from Windows 8 to Windows 10; it’s even free. But it’s not so easy to get used to the new operating system. Yet Windows 10 is designed to be a longer-term solution; if we are going to get accustomed to an operating system, this looks to be the better bet.

Clear Hard Drive: this once required a third-party app. Now you can open the Settings app from the Start menu and go to the ‘Storage’ section. This will show you the files and folders on a drive and how much space they occupy. There are more powerful tools if you want to pay for third-party programmes, but this inbuilt one is quite good.

Uninstall Apps: it used to be a nuisance to uninstall an application unless you paid for a third-party programme. Now you can right-click any app on the Start menu and select Uninstall. If you can’t find the app you want to remove, just click All Apps at the bottom of the Start menu.

Pin Any App to the Start Menu: most people know about this, but it’s a good idea for anything you use on a regular basis. It’s also good to have the old Start menu back.

Keyboard Shortcuts: there are more of these than ever, including:

Win+Q or Win+S: Open Cortana (the personal assistant).

Win+I: Open Windows 10 Settings.

Win+A: Open the notification centre.

Win+Ctrl+D: Create a new virtual desktop.

Win+Ctrl+F4: Close the current virtual desktop.

Privacy

There has been more than a little written about privacy issues in Windows 10. If you are concerned, simply change the settings.

To minimize what is sent to Microsoft:

Settings > Privacy.

Scroll to Feedback & diagnostics.

Under ‘Diagnostic and usage data’, select Basic.

Now only minimal information is sent to Microsoft; hopefully nothing compromising.

Fingerprint Recognition: your machine may or may not have the hardware for this, but the operating system is ready for it. This might be the end of passwords.

Computer Shop

One of the biggest regrets we have with fast-moving technology is the missed opportunities. Individuals upgrade their synthesizers, phones and computers knowing there was untapped potential in the old devices that never got used. We can avoid this, at least to some extent, by familiarizing ourselves with our systems; it also helps to search the internet for other people’s experiences and recommendations. We will never know every detail of our machines, but a little extra knowledge will be useful on a regular basis.

What is Quantum Computing?

In early October 2015, researchers at the University of New South Wales built a logic gate that looks to make quantum computers feasible. But what is a quantum computer?

We know a classic computer uses bits of information, either ones or zeros (on/off, true/false, depending on how you notate them). These bits are processed by a variety of logic gates, which take one or two bits of information and give a one-bit response. An AND gate takes two bits of information and outputs a one (on, true, etc.) if both inputs are one, and outputs a zero under any other circumstance. An OR gate takes two bits and outputs a one if either input is one. There are also NAND, NOR, XOR and NOT gates. All classic computing uses complex combinations of these logic functions, as the sketch below illustrates.
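
As a minimal sketch of how these gates behave (Python is used here purely as notation; real gates are transistor circuits), each gate can be written as a small function and its truth table printed:

    # The classic logic gates as Python functions (True = one, False = zero).
    def AND(a, b): return a and b          # one only if both inputs are one
    def OR(a, b): return a or b            # one if either input is one
    def NOT(a): return not a               # inverts a single bit
    def NAND(a, b): return not (a and b)   # the opposite of AND
    def NOR(a, b): return not (a or b)     # the opposite of OR
    def XOR(a, b): return a != b           # one only if the inputs differ

    # Print a truth table for some of the two-input gates.
    for a in (False, True):
        for b in (False, True):
            print(int(a), int(b), "->",
                  "AND:", int(AND(a, b)),
                  "OR:", int(OR(a, b)),
                  "NAND:", int(NAND(a, b)),
                  "XOR:", int(XOR(a, b)))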

In the past, bits in classic computers have consisted of voltages between zero and a set higher value, usually 5 or 10 volts. A zero bit was represented by a zero (or near-zero) voltage; a one was represented by the higher voltage. With enough bits, a complex pattern of ones and zeros could be represented: with 2 bits there are 4 combinations; with 3 there are 8. With N bits there are 2 to the Nth combinations.
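
A quick way to see the 2-to-the-Nth rule for yourself, again sketched in Python:

    # Enumerate every pattern an N-bit register can hold.
    from itertools import product

    N = 3
    patterns = list(product([0, 1], repeat=N))   # every possible 3-bit pattern
    print(len(patterns))   # 8, which is 2**3
    print(patterns)        # (0, 0, 0), (0, 0, 1), ... (1, 1, 1)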

Quantum computers differ from classic computers in several ways. One difference is that where a classic computer is in exactly one state out of its 2-to-the-Nth possibilities, a quantum computer can simultaneously be in a superposition of many states, up to 2 to the Nth of them. The initial state of the qubits (quantum bits) represents a piece of data; these are passed through a series of gates known as a quantum algorithm, and the resulting state (one of the 2-to-the-Nth possibilities) is the outcome. Because of the probabilistic nature of quantum measurement, this outcome is only considered correct within a certain probability.
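
A toy illustration of superposition: the example below simulates the mathematics on an ordinary computer with numpy; it is our own sketch, not how the UNSW hardware works. A single qubit starts as a definite zero, a standard one-qubit gate (the Hadamard gate) puts it into an equal superposition, and squaring the amplitudes gives the probability of measuring each outcome.

    import numpy as np

    ket0 = np.array([1, 0], dtype=complex)   # the qubit starts as a definite zero

    # The Hadamard gate, a standard one-qubit quantum gate.
    H = np.array([[1, 1],
                  [1, -1]], dtype=complex) / np.sqrt(2)

    state = H @ ket0                    # apply the gate to the state vector
    probabilities = np.abs(state) ** 2  # Born rule: probability = |amplitude|^2

    print(probabilities)                # [0.5 0.5] -- a 50/50 superposition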

The new development at the UNSW is the invention of a logic gate that works with qubits. This has basically been the last major hurdle in building a quantum computer; all the necessary building blocks for operational quantum computers are now here.

The benefits of quantum computers are numerous. Their massively increased speed is one factor: it means programs can be much more complex and still produce results in a reasonable amount of time. Programs to predict the weather once took several days to run, by which time the weather conditions had already come and gone. Modern computers caught up with this at least a generation ago, but faster computers would allow more complex programs and quicker results. Computer models of the human body, very useful for medical research, are now quite feasible and potentially quite complex. Other applications and models are numerous. And because quantum computers use tiny particles such as photons for their qubits, and equally tiny logic gates, they can be much smaller than any present technology.

Now that we have Windows 10

Apparently there are 75 million devices running Windows 10, which is quite something when you remember that it was only officially released a month ago. Xbox gamers alone have clocked up 122 years’ worth of play in this time; I’m not sure if that’s impressive or something to worry about. On the lighter side, Cortana, the digital assistant, has told more than half a million jokes to anybody who asked. Some of this comes from individuals trying to ask the most bizarre questions they can think of; but it turns out that if you want a legitimate joke you just have to ask for one.

On the downside, Chrome browsers are having some compatibility problems with Windows 10. Apparently they aren’t too extreme, but after years of trouble-free Chrome use it is frustrating to have to deal with slow response times and erratically streaming video. Google is apparently working on ironing this out, but simply reinstalling the browser might help. The fact that people still prefer Chrome, with its Windows 10 teething problems, over other browsers says a lot about customer loyalty and Chrome’s generally high-ranking performance.

Of course, the other browser option is the new Edge browser supplied with Windows 10. Apparently far better than Internet Explorer (which always had a mediocre reputation), the only ostensible downside to the simpler-looking Edge is the lack of a few favourite features from the past. But wait: the Settings menu has an option to import favourites from another browser. Choose your old browser, click import, and you may be in luck.

Wi-Fi still causes problems in Windows 10, though this is no different to earlier Windows systems. The situation can be improved by disabling Wi-Fi sharing; otherwise, you may have to reboot.

Actually, disabling Wi-Fi sharing is one of the things we were advised to do the moment we had Windows 10 up and running. The others were to customise the Start menu, choose ‘notify to schedule restart’ in the advanced Windows Update options, and check out the Action Centre. The Action Centre gives you all your notifications in one place, which everybody seems to like and prefer to the tile system in Windows 8.

Windows 95 really was about 20 years ago, and that was based on MS-DOS, something dating back to 1981. But those looking for a similar layout, a familiar user interface, will see that some things have stayed consistent up to Windows 10, even if they had to be changed back again after the less popular Windows 8. One feature we might take for granted is the start-up sound; for whatever reason it became iconic. Twenty years ago, Windows 95 still had MS-DOS under its new user interface. I can’t vouch for how much of this is left under Windows 10, but a familiar UI suggests it has either endured or was significantly influential in developing what we presently have.

Computer Shop Strathfield

Can you imagine a life without computers? I actually know more people who live without a personal car than a personal computer. Of course they use public transport, but we all have public computers in our life too. Banking, library loans, most retailing, even the public transport systems have used computers for decades.

Schools have had some computer integration for more than a generation. Children adapt to them quite easily, showing that computer use is not so much hard as just different to what we were formerly used to. Like a foreign language, it is only difficult when it is foreign to you.

Buying a computer is both simple and tricky. A decent computer is now cheaper than ever; you can buy an HP for a few hundred dollars, faster and with more memory than we dreamed of a decade ago. The slightly more difficult part is buying something that suits your needs. But even this is not so much about the computer as the software you run on it. As long as you have a current operating system you can use the latest software; and even the operating system can be updated as long as your machine is not too old.

Refurbished computers are a tempting proposition, but the quality varies. A second-hand computer sold straight from the previous owner (eBay, Gumtree, etc.) may or may not be good. Avoid these unless you know enough about the computer to assess the quality yourself. Many second-hand computers are simply restored to factory presets without any maintenance, and problems can quickly make your purchase a regretful one.

Factory-refurbished computers are a much safer bet, and often come with some guarantee. They have been completely overhauled and inspected by a qualified technician, and will operate like new. Their only shortcoming is that they use the technical standards from when they were current technology; they are the product of the time they were built. Of course, this is often not a cause for concern. If you just want to use a word processor and surf the net, an older computer might be more than enough. And it will probably run a more current operating system than the one it originally came with. Many people use older computers for anything from music recording to photo editing. Kept free of viruses, these computers can happily run for a decade or two.

One advantage of a refurbished computer is finding reviews by others with the same model. New computers are more advanced, but unproven in the field. Try Googling a computer’s model number and see if it has overheating issues, lacks a certain vital feature, or has any other peculiarities. A little research can show whether a certain model is reliable and appropriate for your situation.

If you are part of the world today you are interacting with computers. Enrich your life with a computer system of your own. Visit the Elite Computers shop, Strathfield.

PRE-PACKAGED CLOUD

One of the big decisions about going to the Cloud is whether to use an in-house or an external system. Those who prefer the in-house approach have the unenviable task of setting up a system themselves, or hiring some external group to set it up. In the past this has proved time consuming and entailed many detailed decisions. Inevitably somebody had to think of a better way.
The Hewlett-Packard Helion Rack is designed to help a company set up a private, in-house cloud much faster. HP builds each system and sets it up at the business site. As these systems use the same OpenStack infrastructure services as HP’s own public cloud, the package has the benefit of both experience and years of practical application.
Mid-sized businesses looking to deploy their first OpenStack cloud will probably be the main target for these rack systems; sizable departments within larger corporations may also be interested. The HP Helion Rack is well suited to developing new cloud-style applications, and is designed for computationally heavy workloads.
One of the selling points of the Cloud has always been convenience and ease of use (along with expandability and cost). But this ease of use is at the user’s end, not the end of the party creating the system. As with complex search algorithms or guidelines for creating web content, the creator of a computer system must go to great lengths to build something that works to the user’s benefit: a complex and flexible system that interfaces easily with the complexities of the human mind and its requirements. A pre-packaged cloud-in-a-rack passes this convenience and ease of use on to the purchasing company. The hard work has already been done by the designers, so the new owner need only specify how they want the system customised to their needs.
Individuals familiar with HP’s public cloud systems may have an advantage here, as this quick-to-install private system runs in a near-identical way. Of course, it will be highly secure and customisable.
The HP Helion Rack will be 42 standard rack units in size (about 187 cm) and is designed to easily accommodate added storage facilities and other external devices. It should be available in May 2015, with prices varying with the system configuration. The lowest-priced system should comfortably support 400 virtual machines.

CLOUD FAQS

Cloud – “A model for enabling convenient, on-demand network access to a shared pool of configurable computing resources (for example, networks, servers, storage, applications and services) that can be rapidly provisioned and released with minimal management effort or service provider interaction”
US National Institute of Standards and Technology

Cloud is not so much a technology as an approach to using technology. It provides massive IT related facilities through the internet, allowing customers to expand and reduce their storage space, data requirements and services as needed.
Public Cloud is a service that anybody (with internet access and a means of payment) can use. It is a shared infrastructure on a pay-as-you-use system.
Private Cloud uses a similar delivery model to public cloud, but only for a select group of users, such as the staff of a company. It can be considered much safer, as the private cloud sits behind a firewall and the company’s own security systems.
Hybrid clouds (public and private clouds that are linked) also exist, as do community clouds, systems shared by several groups but not the public in general.

What are Cloud prices like?
You can store data on the Cloud with fast access times for about 25 cents per gigabyte per month, or as an archived file for about 1 cent per gigabyte per month. Archived files have slower access, but sometimes only by a few seconds. You also pay to upload and download information to and from those files.
Services on the Cloud can cost anything from 10 cents per hour to more than a dollar per hour.
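As a rough worked example using the ballpark figures above (real providers each have their own rate cards, so treat the numbers as illustrative only):

    # Approximate monthly storage cost from the figures quoted above.
    def monthly_storage_cost(gigabytes, archived=False):
        rate = 0.01 if archived else 0.25   # dollars per GB per month
        return gigabytes * rate

    print(monthly_storage_cost(500))                 # 500 GB fast access: 125.0 dollars
    print(monthly_storage_cost(500, archived=True))  # 500 GB archived: 5.0 dollars
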
How is software licencing affected by the Cloud?
This is a good question, and the answer is still being worked out. Before the Cloud, a customer paid the retail price for software even if they hardly used its potential capabilities. Cloud is potentially more flexible, and should allow individuals and companies to pay only for services as they are used, and pay less if they only use a small part of a service’s potential. At the moment, though, software licencing is less flexible than what the Cloud is capable of.
What about downtime and access of data?
Many cloud providers guarantee between 99.9% and 99.95% uptime, calculated over a year. This works out to no more than about 4.4 to 8.8 hours of downtime (offline with no data access) in a 12-month period. But fine print in contracts sometimes means more downtime can occur for which the provider will not take responsibility. Really, there are no complete guarantees. But as even non-Cloud computers and other internet services occasionally fail, we cannot expect perfection.
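The downtime arithmetic is easy to check for yourself:

    # Convert an SLA uptime percentage into allowed downtime per year.
    HOURS_PER_YEAR = 365 * 24   # 8760

    def allowed_downtime_hours(uptime_percent):
        return HOURS_PER_YEAR * (100 - uptime_percent) / 100

    print(allowed_downtime_hours(99.9))    # 8.76 hours per year
    print(allowed_downtime_hours(99.95))   # 4.38 hours per year
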
Will my application perform differently on Cloud?
Quite probably, but it depends on which cloud provider you have. Find one that works for you and runs services that are appropriate for your company. You may have to ask a lot of questions here and do a lot of reading, but there are appropriate providers out there for you somewhere. It helps to think of the situation as offering a lot of different options, and to think of cloud as upgrading your business operation. As such, you don’t have to follow the same procedure as you had before; look for a Cloud provider that offers a new and improved situation.
If you have Infrastructure as a Service (IaaS) or Platform as a Service (PaaS) you may have a lot of flexibility over which applications you wish to run. However, the method of charging for this may vary. With Software as a Service (SaaS) you might only pay for software/applications as they are used; with PaaS you probably have to pay the full software price upfront.

SOFTWARE DEFINED STORAGE

Software-defined storage is a programmatic approach to setting up and using storage. It allows you to use storage and expand it by rational prediction of your needs. It is not a product; it is an approach made possible through connected hardware and software products. The details are not particularly important, but the connections between the parts are quite loose.
Storage space is an issue in some ways. But as storage becomes increasingly cheap, people tend to think the problems will solve themselves. There is some truth to this. Data does increase exponentially, but hardware manufacturers know this and are quick to produce larger-capacity storage media because they know there is a growing market, and a profitable one at that. It is to their credit that prices decrease as capacity and performance increase.
If we’re on the Cloud we tend to think this is a pseudo-problem; cloud is supposed to expand as requirements increase. Well, yes, if it takes the software-defined storage approach. Your data will accumulate, and if it’s anything like the past the increase will not be linear. We used to deal in megabytes; now we buy terabyte drives. The software-defined storage approach should track this, so we need never trash old customer files and can keep every email we ever received. We can be sure this growth will not level out in the future, but will continue even if your company stays at its present size forever (we actually hope your company does improve). The elastic approach of software-defined storage predicts how storage will expand, so we need only buy the space we need. Yes, cloud can expand to accommodate our needs; that’s part of the appeal. But we need to know what our needs are, and that’s part of the problem it addresses.
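A toy sketch of that ‘rational prediction’ idea: if past usage shows a steady percentage growth, project it forward so capacity is bought just ahead of need. The starting figure and growth rate below are invented purely for illustration:

    current_tb = 40         # storage in use today, in terabytes (example figure)
    growth_per_year = 0.35  # assumed 35% annual growth, fitted from past usage

    for year in range(1, 6):
        projected = current_tb * (1 + growth_per_year) ** year
        print(f"Year {year}: plan for about {projected:.0f} TB")
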

REAL WORLD CONNECTIONS

Arthur C. Clarke’s novel The City and the Stars voiced two different possibilities for a future society. One was a city run by computers, totally detached from the physical outside world, forming its own microcosm. The other was a society living in harmony with the physical world, but augmenting it with technology. A generation ago there was great fear that our future would be a sterile world run by computers, like the detached microcosm city, and that we would lose something human. But modern developments, concerns for the environment, human engineering and so on have tended to lean towards the other option: humans developing technology that takes its cues from the real world.
Interfacing technology via the cloud is an example of this. Car manufacturer Volvo envisions a system where information from a car gets distributed via a connection to the cloud. If your car encounters slippery road conditions, traffic, or anything else of concern to motorists, the car’s cloud connection passes the information to others nearby who might be using the same roads.
This is not so much a new concept as the development of an old one. We’ve had traffic reports for years, and we have apps on smartphones where individuals can look up road and weather information. But this approach is more integrated; the information goes straight to the car and driver, who doesn’t have to use a phone while driving or listen to a possibly relevant radio report. It means a car’s navigation system can give the most economical route to a destination at a particular point in time, rather than a route that would only be best under ideal conditions. The information reaching the car is current, integrated with the car’s systems, and far more detailed than before.
Of course, it is only as good as the people who use it. If there are only a few cars and drivers with the system, they have to hope some other driver has already gone the route ahead of them if they are to gather any useful information. Whereas if the majority of people have the cloud-accessing systems, there will always be relevant information as long as there is some traffic on the road. Any driver can benefit from the experience of another driver, even if the experience was only moments before.
This could well have an impact on insurance. Hopefully it will prevent a few incidents, but even if it doesn’t stop all problems it might help to know that drivers were at least complying with up-to-date information and following a recommended path.
Technology with this approach adapts to the world, a world that we have also adapted to us, having built cities, roads and other technology. The information these systems convey is far more extensive than before, but it is never complete or final, as the outside world is always changing. Any concern we once had of being isolated in our own stagnant world now seems unfounded. In an infinitely complex and changing reality there will always be constant change in how we adapt to it.

FAST AND SLOW CLOUD PRICING OPTIONS

We get the impression that going to the Cloud just means signing up and getting the payment and login details. Of course, it is not that simple. We have no end of articles telling us about teething problems, or about being better off with hybrid or private or public cloud, or anything other than what we had before. But there are still more things to consider, and that means wading through a lot of figures and making a few worrying decisions. Actual experience might change a few decisions and opinions too.
Speed will vary according to which cloud provider you use, both the company and the options within that company. Bigger machines can be slower, and the reasons for this vary. This is complicated further by the fact that different test programs can run quickly in some situations but less quickly in others, and not at all on servers that aren’t set up for that particular software.
It would make sense if the more expensive services were faster, or had some other advantage, but this is not always the case. More CPUs will be faster, all else being equal, but while 8 CPUs will be faster than one, they are unlikely to be 8 times as fast. Windows Azure machines were more than twice as fast when they used the 8-CPU option instead of one, but that doubling of speed came at 8 times the cost.
If you increase the number of CPUs but keep the same amount of RAM you might save some cost, but this can affect performance in unpredictable ways. Sometimes a 2-CPU machine is faster than a single-CPU machine with the same RAM, but not by a large amount (perhaps 30%). Sometimes the 2-CPU machine is actually slower.
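One simple way to compare such options is price per unit of performance. The benchmark scores and hourly prices below are invented placeholders shaped by the figures above (8 CPUs roughly twice as fast at 8 times the cost); plug in numbers from your own provider’s tests:

    # Price per benchmark point: lower means better value.
    options = {
        "1 CPU":  {"price_per_hour": 0.10, "benchmark_score": 100},
        "2 CPUs": {"price_per_hour": 0.20, "benchmark_score": 130},  # ~30% faster
        "8 CPUs": {"price_per_hour": 0.80, "benchmark_score": 210},  # ~2x faster
    }

    for name, option in options.items():
        cost = option["price_per_hour"] / option["benchmark_score"]
        print(f"{name}: ${cost:.4f} per benchmark point per hour")
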
Google Cloud seems to have a fair way of matching performance with price, so that its expensive options do have significantly better performance. How that compares to the options offered by another company is another matter.
To make matters more confusing, performance varies over time even with the same system and provider. If you’re using Cloud you are sharing resources with other users, because that’s what Cloud is about. If a lot of computing power is being used by other groups, you don’t get any special consideration; things will be slower. Occasionally this works to your advantage and you get bursts of high-speed interaction, but only at off-peak usage times. If you only want lots of computing power for short periods of time this might be fine. Constant usage, however, will probably mean fluctuating speeds.
Fluctuating Cloud speed might make a smaller provider more attractive, or a super-large provider with greater resources; but you need a lot of CPUs and RAM to achieve significantly higher speeds. Even then it’s hard to know which option is best, as speed still varies with the application and will fluctuate with user traffic. You can’t know these things until you (or somebody else) have tried the service, and even then it might degrade if the provider accumulates more clients, or improve when they decide to upgrade.

http://www.computerworld.com.au/article/539239/ultimate_cloud_speed_tests_amazon_vs_google_vs_windows_azure/?fp=4&fpid=51023