DR DECAL

Plastic models have always been popular. They may have their origins in the handicrafts of our distant ancestors, combined with a little engineering skill from later societies. There is a certain satisfaction in building a scale model of an item, whether it is a child’s toy or an exact replica of a historical object. Until recently the options for building such scale models were limited to pre-packaged kits or the engineering skills of the individual creator. This has changed with the advent of 3D printers.

3D printers have been coming into the public’s hands for the past few years, a quiet revolution that started in high-end hobby and electronics stores and spread into high school science projects. As with all technology the equipment was initially expensive, but the printers soon became another moderately priced computer peripheral. As most households already had a computer, some interested parties simply added a 3D printer alongside the regular printer and the net connection.

3D printing makes almost any plastic model project conceivable, though size and moving parts can be limitations. Even then it is possible to print smaller items that combine into larger ones; a certain amount of pre-planning is required for this.

Plastic Model Decals

3D printing produces an object of the right shape, right down to the fine detail. But the colour and surface patterns must then be added. This is where paint and decals come in.

Designing your own model decals is not too difficult, and probably much easier than the 3D printing process. Anybody who can make a design can have it converted into waterslide decals, or they can buy waterslide decal paper and print their own scale model decals.

With 3D printing and printed decals there is no limit to the types of scale model that can be produced. Model train enthusiasts can use them to create any historical variation on a general train design, with any decal markings. At the other end of the creative spectrum people can design any sci-fi vehicle or demonstrate architectural designs. Decals add the final touch.


Now that we have Windows 10

Apparently there are 75 million devices running Windows 10, which is quite something when you remember that it was only officially released a month ago. Xbox gamers alone have clocked up 122 years’ worth of play in this time; I’m not sure if that’s impressive or something to worry about. On a lighter note, Cortana, the digital assistant, has told more than half a million jokes to anybody who asked. Some of this comes from individuals trying to ask the most bizarre questions they can think of, but it turns out that if you want a legitimate joke you just have to ask for one.

On the downside, Chrome is having some compatibility problems with Windows 10. Apparently they aren’t too extreme, but after years of trouble-free Chrome use it is frustrating to have to deal with slow response times and erratically streaming video. The Chrome team is apparently working on ironing it out, but simply reinstalling the browser might help. The fact that people still prefer Chrome, Windows 10 teething problems and all, over other browsers says a lot about customer loyalty and Chrome’s generally high-ranking performance.

Of course the other browser option is the new Edge browser, supplied with Windows 10. Apparently far better than Internet Explorer (which always had a mediocre reputation), the only ostensible downside to the simpler-looking Edge is the lack of a few favourite features from the past. But wait: the Settings menu apparently has an ‘Import favourites from another browser’ option. Choose your old browser, click Import, and it looks like you might be in luck.

Wi-Fi still causes problems in Windows 10, though this is no different from earlier Windows systems. The situation can be improved by disabling Wi-Fi sharing; otherwise, you may have to reboot.

Actually, disabling Wi-Fi sharing is one of the things we were advised to do the moment we had Windows 10 up and running. The others were to customise the Start menu, choose ‘notify to schedule restart’ in the advanced Windows Update options, and check out the Action Centre. The Action Centre gives you all your notifications in one place, which everybody seems to like and prefer to the tile system in Windows 8.

Windows 95 really was about 20 years ago today, and that was based on MS-DOS, something dating back to 1981. But those looking for a similar layout, a familiar user interface, will see that some things have stayed consistent up to Windows 10, even if they had to be changed back again after the less popular Windows 8. One feature we might take for granted is the start-up sound; for whatever reason it became iconic. Twenty years ago Windows 95 still had MS-DOS under its new user interface. I can’t vouch for how much of this is left under Windows 10, but a familiar UI suggests it has either endured, or was significantly influential in developing what we presently have.

Computer Shop Strathfield

Can you imagine a life without computers? I actually know more people who live without a personal car than without a personal computer. Of course they use public transport, but we all have public computers in our lives too. Banking, library loans, most retailing, even the public transport systems have used computers for decades.

Schools have had some computer integration for more than a generation. Children adapt to them quite easily, showing that computer use is not so much hard as just different from what we were formerly used to. Like a foreign language, it is only difficult when it is foreign to you.

Buying a computer is both simple and tricky. A decent computer is now cheaper than ever; you can buy an HP for a few hundred dollars, faster and with more memory than we dreamed of a decade ago. The slightly more difficult part is buying something that suits your needs. But even this is not so much about the computer as the software you run on it. As long as you have a current operating system you can use the latest software; and even the operating system can be updated as long as your machine is not too old.

Refurbished computers are a tempting proposition, but the quality varies. A second-hand computer sold straight from the previous owner (eBay, Gumtree, etc.) may or may not be good. Avoid these unless you know enough about computers to assess the quality yourself. Many second-hand computers are simply restored to factory presets, without any maintenance. Problems can quickly make your purchase a regrettable one.

Factory-refurbished computers are a much safer bet, and often come with some guarantee. They have been completely overhauled and inspected by a qualified technician, and will operate like new. Their only shortcoming is that they use the technical standards from when they were current technology; they are the product of the time they were built. Of course this is often not a cause for concern. If you just want to use a word processor and surf the net, an older computer might be more than enough, and it will probably run a more current operating system than the one it originally came with. Many people use older computers for anything from music recording to photo editing. Kept free of viruses, these computers can happily run for a decade or two.

One advantage of a refurbished computer is finding reviews by others with the same model. New computers are more advanced, but unproven in the field. Try Googling a computer’s model number and see if it has overheating issues, lacks a certain vital feature, or has any other peculiarities. A little research can show whether a certain model is reliable and appropriate for your situation.

If you are part of the world today you are interacting with computers. Enrich your life with a computer system of your own. Visit the Elite computer shop, Strathfield.

PRE-PACKAGED CLOUD

One of the big decisions about going to the Cloud is whether to use an in-house or an external system. Those who prefer the in-house approach have the unenviable task of setting up a system themselves, or hiring an external group to set it up. In the past this has proved time-consuming and entailed many detailed decisions. Inevitably somebody had to think of a better way.
The Hewlett-Packard Helion Rack is designed to help a company set up a private, in-house cloud much faster. HP builds each system and sets it up at the business site. Because these systems use the same OpenStack infrastructure services as HP’s own public cloud, the package has the benefit of both experience and years of practical application.
Mid-sized businesses looking to deploy their first OpenStack cloud system will probably be the main target for these rack systems; sizable departments within larger corporations may also be interested. The HP Helion Rack system is well suited to developing new cloud-type applications, and is designed for computationally heavy workloads.
One of the selling points of the Cloud has always been convenience and ease of use (along with expandability and cost). But this ease of use is at the user’s end, not for the party creating the system. Like complex search algorithms or guidelines for creating web content, any creator of a computer system is forced to go to great lengths to create something that works to the user’s benefit: a complex and flexible system that interfaces easily with the complexities of the human mind and its requirements. A pre-packaged cloud-in-a-rack may well pass this convenience and ease of use on to the purchasing company. The hard work is already done by the designers, so the new owner need only specify how they want the system customised to their needs.
Individuals familiar with HP’s public cloud systems may have an advantage here, as this quick-to-install private system runs in a near-identical way. Of course it is designed to be secure and customisable.
The HP Helion Rack will be 42 standard rack units in size (a little under 188 cm) and is designed to easily accommodate added storage facilities and other external devices. It should be available in May 2015, with prices varying with the system configuration. The lowest-priced system should comfortably support 400 virtual machines.

CLOUD FAQS

Cloud – “A model for enabling convenient, on-demand network access to a shared pool of configurable computing resources (for example, networks, servers, storage, applications and services) that can be rapidly provisioned and released with minimal management effort or service provider interaction”
US National Institute of Standards and Technology

Cloud is not so much a technology as an approach to using technology. It provides massive IT-related facilities through the internet, allowing customers to expand and reduce their storage space, data requirements and services as needed.
Public Cloud is a service that anybody (with internet access and a means of payment) can use. It is a shared infrastructure on a pay-as-you-use system.
Private Cloud uses a similar delivery mode to public cloud, but only for a select group of users, like the staff of a company. It can be considered much safer as the private cloud sits behind a firewall and the company’s own security systems.
Hybrid clouds (linked public and private clouds) also exist, as do community clouds, systems shared by several groups but not the public in general.

What are Cloud prices like?
You can store data on the Cloud with fast access times for about 25 cents per gigabyte per month, or as an archived file for about 1 cent per gigabyte per month. Archived files have slower access, though sometimes only by a few seconds. You also pay to upload and download information to and from those files.
Services on the Cloud can cost anything from 10 cents per hour to more than a dollar per hour.
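To put those figures together, here is a minimal back-of-the-envelope sketch in Python. The rates and quantities are only the indicative numbers quoted above; real providers price storage, requests and transfers differently.

    # Rough monthly storage cost estimate using the indicative rates above.
    # These rates are illustrative assumptions, not any provider's price list.
    HOT_RATE_PER_GB = 0.25      # dollars per GB per month, fast access
    ARCHIVE_RATE_PER_GB = 0.01  # dollars per GB per month, archived

    def monthly_storage_cost(hot_gb, archive_gb):
        """Estimated monthly bill for fast-access and archived storage."""
        return hot_gb * HOT_RATE_PER_GB + archive_gb * ARCHIVE_RATE_PER_GB

    # Example: 200 GB of working data plus 2000 GB of archives.
    print(monthly_storage_cost(hot_gb=200, archive_gb=2000))  # 50 + 20 = 70 dollars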
How is software licensing affected by the Cloud?
This is a good question, and the answer is still being worked out. Before the Cloud, a customer paid retail price for software even if they hardly used its potential capabilities. Cloud is potentially more flexible, and should allow individuals and companies to pay only for services as they are used, and pay less if they only use a small part of the service’s potential. At the moment, though, software and its licensing remain less flexible than what the cloud is capable of.
What about downtime and access of data?
Many cloud providers guarantee between 99.9% and 99.95% uptime, calculated over a year. This means roughly 4 to 9 hours of downtime (offline with no data access) in a 12-month period. But fine print in contracts sometimes means more downtime can occur that the provider will not take responsibility for. Really, there are no complete guarantees. But even non-Cloud computers and other internet services occasionally fail, so we cannot expect perfection.
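As a quick check of those figures, here is a minimal Python sketch that converts an uptime guarantee into allowed downtime per year:

    # Convert an uptime guarantee (SLA percentage) into allowed yearly downtime.
    HOURS_PER_YEAR = 365 * 24  # 8760

    def allowed_downtime_hours(uptime_percent):
        """Hours per year a service may be down under the given guarantee."""
        return HOURS_PER_YEAR * (100 - uptime_percent) / 100

    print(round(allowed_downtime_hours(99.9), 1))   # 8.8 hours per year
    print(round(allowed_downtime_hours(99.95), 1))  # 4.4 hours per year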
Will my application perform differently on Cloud?
Quite probably, but it depends on which cloud provider you have. Find one that works for you and runs services that are appropriate for your company. You may have to ask a lot of questions here and do a lot of reading, but there are appropriate providers out there for you somewhere. It helps to think of the situation as offering a lot of different options, and to think of cloud as upgrading your business operation. As such, you don’t have to follow the same procedure as you had before; look for a Cloud provider that offers a new and improved situation.
If you have Infrastructure as a Service (IaaS) or Platform as a Service (PaaS) you may have a lot of flexibility over which applications you wish to run. However, the method of charging for this may vary. With Software as a Service (SaaS) you might only pay for software/applications as they are used; with PaaS you probably have to pay the full software price upfront.

SOFTWARE DEFINED STORAGE

Software defined storage is a programmatic approach to setting up and using storage. It allows you to use storage and expand it by rational prediction of your needs. It is not a product; it is an approach made possible through connected hardware and software products. The details are not particularly important, but the connections between the parts are quite loose.
Storage space is an issue in some ways. But as storage becomes increasingly cheap, people tend to think the problems will solve themselves. There is some truth to this. Data does increase exponentially, but hardware manufacturers know this and are quick to produce larger-capacity storage media because they know there is a growing market, and a profitable one at that. It is to their credit that prices decrease as capacity and performance increase.
If we’re on the Cloud we tend to think this is a pseudo-problem; cloud is supposed to expand as the requirements increase. Well yes, if it takes the software defined storage approach. Your data will accumulate, and if it’s anything like the past the increase will not be linear. We used to deal with megabytes; now we buy terabyte drives. The software defined storage approach should track this, so we need never trash old customer files and can keep every email we ever received. We can be sure this will not level out in the future, but will continue to expand even if your company stays at its present size forever (we actually hope your company does improve). The elastic approach of software defined storage predicts how this will expand, so we need only buy the storage space we need. Yes, cloud can expand to accommodate our needs; that’s part of the appeal. But we need to know what our needs are, and that’s part of the problem it wishes to address.
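As a rough illustration of that kind of prediction, here is a minimal Python sketch that estimates a growth rate from past usage and projects how much capacity to provision. The usage figures and the 20% headroom are invented for the example; a real software defined storage system would do this continuously and far more carefully.

    # Project future storage needs from past usage, assuming roughly
    # exponential growth. The figures are invented for illustration.
    usage_tb = [1.0, 1.3, 1.7, 2.2]  # end-of-quarter usage in terabytes

    # Average quarter-on-quarter growth factor.
    growth = sum(b / a for a, b in zip(usage_tb, usage_tb[1:])) / (len(usage_tb) - 1)

    def projected_capacity(quarters_ahead, headroom=1.2):
        """Capacity (TB) to provision, with 20% headroom over the projection."""
        return usage_tb[-1] * (growth ** quarters_ahead) * headroom

    print(round(growth, 2))                 # about 1.3x growth per quarter
    print(round(projected_capacity(4), 1))  # capacity to plan for a year ahead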

REAL WORLD CONNECTIONS

Arthur C. Clarke’s novel The City and the Stars voiced two different possibilities for a future society. One was a city run by computers, totally detached from the physical outside world, forming its own microcosm. The other was a society living in harmony with the physical world, but augmenting it with technology. A generation ago there was great fear that our future would be a sterile world run by computers, like the detached microcosm city, and that we would lose something human. But modern developments, concerns for the environment, human engineering and so on have tended to lean towards the other option: humans developing technology that takes its cues from the real world.
Interfacing technology via the cloud is an example of this. Car manufacturer Volvo envisions a system where information from a car gets distributed via a connection on the cloud. If your car encounters slippery road conditions, traffic, or anything else of concern to motorists, the car’s cloud connection passes the information to others nearby who might be using the same roads.
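To make the idea concrete, here is a minimal Python sketch of cars publishing road-condition reports to a shared cloud service that other cars can query for their route. The data model, names and matching logic are invented for illustration; they are not Volvo’s actual system.

    # Minimal sketch of cloud-shared road condition reports.
    # Everything here is a simplified stand-in, not a real vendor API.
    import time
    from dataclasses import dataclass

    @dataclass
    class RoadReport:
        road_id: str      # identifier for a road segment
        condition: str    # e.g. "slippery", "congested"
        timestamp: float  # when the report was made

    class ConditionService:
        """Stands in for the shared cloud service cars report to."""
        def __init__(self):
            self.reports = []

        def publish(self, report):
            self.reports.append(report)

        def recent_for(self, road_id, max_age_s=900):
            """Reports for a segment that are less than max_age_s seconds old."""
            now = time.time()
            return [r for r in self.reports
                    if r.road_id == road_id and now - r.timestamp < max_age_s]

    service = ConditionService()
    service.publish(RoadReport("segment-12", "slippery", time.time()))
    # A following car checks the segment before reaching it.
    print(service.recent_for("segment-12"))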
This is not really so much a new concept as the development of an old one. We’ve had traffic reports for years, and we have apps on smartphones where individuals can look up road and weather information. But this approach is more integrated; the information goes straight to the car and driver who doesn’t have to use the phone while driving, or listen to a possibly relevant radio report. It means navigation systems on cars can give the most economical route to a destination at a particular point in time rather than a route that would be best under ideal conditions. This information to the car is current, integrated with the car’s systems, and has far greater detail than before.
Of course it is only as good as the people who use it. If there are only a few cars and drivers with the system, they have to hope some other driver has already gone the route ahead of them if they are to gather any useful information. Whereas if the majority of people have cloud-accessing systems, there will always be relevant information as long as there is some traffic on the road. Any driver can benefit from the experience of another driver, even if the experience was only moments before.
Undoubtedly this could have an impact on insurance. Hopefully it will prevent a few incidents, but even if it doesn’t stop all problems it might help if we know that the drivers were at least complying with up-to-date information and following a recommended path.
Technology with this approach adapts to the world, which is also a world that we have adapted to us, having built cities, roads and other technology. The information these systems convey is far more extensive than before, but it is never complete or final, as the outside world is always changing. Any concern we once had of being isolated in our own stagnant world now seems unfounded. In an infinitely complex and changing reality there will always be constant change in how we adapt to it.

FAST AND SLOW CLOUD PRICING OPTIONS

We get the impression that going to the Cloud means signing up and getting the payment and login details. Of course this is false. We have no end of articles telling us about teething problems, or being better off with hybrid or private or public cloud or anything other than what we had before. But there are still more things to consider, and that means wading through a lot of figures and making a few worrying decisions. And actual experience might change a few decisions and opinions too.
The speed will vary according to which cloud service you use, both the company and the options within that company. Bigger machines can be slower, and the reasons for this vary. This is complicated further by the fact that different test programs can run quickly in some situations but less quickly in others, and not at all on servers that aren’t set up for that particular software.
It would make sense if the more expensive services were faster, or had some other advantage, but this is not always the case. More CPUs will be faster, all else being equal, but while 8 CPUs will be faster than one, they are unlikely to be 8 times as fast. Windows Azure machines were more than twice as fast when they used the 8-CPU option instead of one, but that doubling of speed came at 8 times the cost.
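A quick way to compare such options is cost per unit of work done. Here is a minimal Python sketch using the rough figures above (roughly twice the speed at eight times the price), which are indicative only:

    # Compare price/performance of instance options.
    # The speed-up and price figures are rough, for illustration only.
    def cost_per_unit_work(hourly_price, relative_speed):
        """Dollars spent per unit of work, relative to a baseline machine."""
        return hourly_price / relative_speed

    baseline  = cost_per_unit_work(hourly_price=1.0, relative_speed=1.0)
    eight_cpu = cost_per_unit_work(hourly_price=8.0, relative_speed=2.0)
    print(eight_cpu / baseline)  # 4.0: four times the cost per unit of work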
If you increase the number of CPUs but keep the same amount of RAM you might save some cost. But this can affect performance in unpredictable ways. Sometimes a 2-CPU machine is faster than a single-CPU machine with the same RAM, but not by a large amount (perhaps 30%). Sometimes the 2-CPU machine is actually slower.
Google Cloud seems to have a fair way of tallying their performance with price, so that their expensive options do have significantly better performance. How that compares to the options offered by another company is another matter.
To make matters more confusing, performance varies over time even with the same system and provider. If you’re using Cloud you are sharing resources with other users, because that’s what Cloud is about. If a lot of computing power is being used by a lot of other groups, then you don’t get any special consideration; things will be slower. Occasionally this works to your advantage and you get bursts of high-speed interaction, but only at off-peak times. If you only want lots of computing power for short periods of time this might be fine. Constant usage, however, will probably mean fluctuating speeds.
Fluctuating Cloud speed might make a smaller provider more attractive, or a super-large provider with greater resources; but you need a lot of CPUs and RAM to achieve significantly higher speeds. Even then it’s hard to know which option is best, as speed still varies with the application and will fluctuate with user traffic. You can’t know these things until you (or somebody else) has tried the service, and even then it might degrade if they accumulate more clients, or improve when they decide to upgrade.

http://www.computerworld.com.au/article/539239/ultimate_cloud_speed_tests_amazon_vs_google_vs_windows_azure/?fp=4&fpid=51023

CLOUD TRENDS IN EARLY 2015

– Security will always be a concern. But this is most noticeable in a time of transition, specifically a business’s transition to the public cloud. A mixture of new and established approaches will be used and modified/developed for the cloud environment. There will always be hackers and other threats, but the cloud will end up at least as safe as previous computing models.
– Hybrid cloud, the combining of public and private, will be common. Hybrid cloud is not a set template, so each company will have their own combination that integrates the two forms. The solution provider will need the skills to do this, but there should be plenty of work for those who are capable.
– Mobile apps already allow employees to work from anywhere. Having half a dozen enterprise apps or more on your smartphone will soon be quite regular. Offices will become further decentralized as employees work from wherever they are.
– The phone and IT systems may merge; mergers and acquisitions between companies may be common, and hopefully mean well-integrated services.
– The internet of things has been imminent for a while. If it seems to be progressing slowly it may be because of the huge range of items and services that may come about, and the fact that the changing computer landscape means nothing can be finalized until standards are set. Watches and Google Glass have been selling for a while. Other wearables such as clothing with sensors also exist. Expect home appliances, furniture, healthcare, manufacturing and much else to be net connected.
– Many trends followed or predicted in the past have proved false, often comically so. We tend to notice either the true patterns or the laughably false ones at any point in time, and draw attention to one or the other. Really, some patterns prove accurate, some do not. The fact that the false predictions are permanently recorded on the net proves embarrassing; the true predictions are also recorded, but just seem obvious in hindsight.

DATA LOSS

Data integrity has to be an important issue for everybody. Sure, some things may be more important than others, but does anybody bother to store data that isn’t of some importance? The matter is important for the reputation of the cloud providers. Even if the things they lost last time were somehow unimportant, wouldn’t we take that as a warning not to trust them next time?
Cloud providers tend to balance the cost effectiveness of services with the quality of those services; it’s the same for a lot of business situations. There is a general trend for the client’s level of protection to vary with the type of service. IaaS (Infrastructure as a Service) provides a means of creating a cloud environment, but data backup may not be included. PaaS (Platform as a Service) will have more selling points, usually including data protection. But this varies greatly between providers. Some IaaS providers have good solid data protection options, though you will pay a little more for this security.
There is data on the Cloud that is part of the provider’s operations and does not affect the customer to any real degree. Extreme loss here might jeopardise the provider’s business, but really their ability to protect this data is only a concern to us in as much as it indicates their general level of security.
Of more concern is data loss that affects either the provider and the client, or only the client. The first category includes environment settings, configurations, virtual networking, provisioning management and so on. This is not the data provided by the client but the packaging it is stored in; it’s like losing your word processor program. The second category is the client’s own information, the data they put on the cloud. This is like the words typed into the word processor, but not the program itself. Loss of this is what we are most concerned with.
Really it is the customer who is most responsible for this data. Providers vary in what they offer, so the client has to choose the best options available. There was certainly data loss before the Cloud, and we took measures against it then. We need to be equally active now, or at least use a provider who is active for us. Some measures include:
Disk-level data protection: an old but effective practice.
Regular backup: periodically backing up data to a lower-cost medium. Somebody has to decide how often to do this and accept that the most recent updates may be lost, but otherwise this is a tried and true system. Drives of several terabytes are quite cheap these days. A minimal sketch of the idea follows this list.
Data replication: another older idea that stays in use because it works well. Software sends all the data to two different storage media. But check the ability to retrieve the data from the secondary source.
Journaled/checkpoint-based replication: recording changes in a journal or at set checkpoints, so the secondary copy can be restored to a known consistent point.
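As mentioned in the backup item above, here is a minimal Python sketch of the periodic-backup idea: copy a data directory to a dated folder on a cheaper storage target on a schedule. The paths and the daily interval are placeholders; in practice you would use the provider’s backup tooling or a scheduler such as cron.

    # Minimal periodic-backup sketch: copy a data directory to a dated
    # folder on a lower-cost storage target. Paths are placeholders.
    import shutil
    import time
    from datetime import date
    from pathlib import Path

    DATA_DIR = Path("/srv/app-data")          # placeholder source directory
    BACKUP_ROOT = Path("/mnt/cheap-storage")  # placeholder low-cost target

    def run_backup():
        """Copy the data directory into a folder named after today's date."""
        target = BACKUP_ROOT / f"backup-{date.today().isoformat()}"
        shutil.copytree(DATA_DIR, target, dirs_exist_ok=True)
        return target

    if __name__ == "__main__":
        while True:
            print("Backed up to", run_backup())
            time.sleep(24 * 60 * 60)  # once a day; cron is the usual choice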

The cloud infrastructure and provider services are important to look at, but we must remember that the main reason for downtime remains human error. One human error is simply choosing an option that isn’t the best suited to your needs.

References
http://www.cloudcomputing-news.net/news/2015/jan/15/how-cloud-providers-can-prevent-data-loss-guide/