Not just the shield: we will also help you oversee your IT department.

We help simplify security, giving you oversight of your security exposure and security profile.

There are a lot of options to discuss; we can help you oversee firewall configuration, network security, and anti-virus setup.

This is a dangerous time for a business: you must provide services for your customers while keeping security exposures at a risk level that is not dangerous.

Check our new website:

Blog posts start below

Bypass and IBM Proventia GX5008 (test environment) -- cloud server infrastructure, Saint Louis (production) -- DB cloud servers, Saint Louis (production) -- computer infrastructure, Saint Louis (production)

The IBM Proventia GX5008 systems plus bypass at a previous project in the testing environment - a great solution for automatically removing malware from network traffic (leftmost image).

The other pictures are scanned from past cloud projects in the 2000-2003 time frame.

2014 blog post: yes, with the new year we are creating a new site:

Web Blog Posts

Just fixed a computer with the Harbinger 'rootkit' virus.

Typical tools were used to clean the computer, plus Kaspersky's bootkit removal tool, TDSSKiller (Kaspersky webpage link).

If you do decide to download from other sources (a Google search on TDSSKiller sometimes turns up 3rd-party mirrors), then download from the CNET webpage.

I tried downloading from some other sources as well. It was interesting how the virus operated: it created audio soundtracks when a browser opened. The audio came from various sound files on the computer and from ads on the Internet. The virus ran in the background (it was not obvious it was running, but you could hear it). The rootkit was likely running and doing its master's bidding.

The sound was odd advertisements, shows local to the area, and other random noises - very odd, and of course annoying. After the Kaspersky tool ran (in safe mode), the system was clean.

Also remember to patch your machines - a lot of Microsoft vulnerability bulletins are coming out soon.

Here is the Microsoft Techcenter Security Bulletin list: Microsoft webpage


From the 2013 posts: Gartner Magic Quadrant 2013
Gartner has a method for weighing the managed hosting companies -- also considered
"cloud computing".
The report shows that the companies were mostly evaluated on IaaS (Infrastructure as a Service).

Cloud company categorization at this link:
Cloud company categorization

The Talkin Cloud 100 also categorizes cloud companies:
Here is the link to TalkinCloud100

One of the most important Penetration Scan tools is:

Nmap Port Scanning Technique

Using several scans one can see if a system is vulnerable or not.
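Nmap's most basic technique, the TCP connect scan, can be sketched in a few lines of Python. This is a toy illustration using only the standard library; the host and port list below are placeholders, and you should only scan systems you are authorized to test.

```python
# Toy TCP connect scan: try to complete a handshake on each port.
import socket

def scan_ports(host, ports, timeout=1.0):
    """Return the subset of `ports` that accept a TCP connection."""
    open_ports = []
    for port in ports:
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
            s.settimeout(timeout)
            if s.connect_ex((host, port)) == 0:  # 0 means the connect succeeded
                open_ports.append(port)
    return open_ports

print(scan_ports("127.0.0.1", [22, 80, 443]))
```

Nmap does far more than this (SYN scans, service and OS detection, timing control), which is why it is the standard tool.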

(more to come on this topic)


Richard Stallman is not impressed by cloud computing

Here is the specific quote: "The concept of using web-based programs like Google's Gmail is 'worse than stupidity', according to a leading advocate of free software."

He is not impressed with proprietary systems, which means he will not like Gmail, Gdrive, and the rest either.

This is not a surprise, and one should always be careful where and how to place your data. A proprietary system has advantages and drawbacks.

Richard Stallman created the free software movement (open source). He is referenced at the GNU site

Here is relevant information from the Free Software Foundation page:

In fact, such a movement exists, and you can be part of it. The free software movement was started in 1983 by computer scientist Richard M. Stallman, when he launched a project called GNU, which stands for “GNU is Not UNIX”, to provide a replacement for the UNIX operating system—a replacement that would respect the freedoms of those using it. Then in 1985, Stallman started the Free Software Foundation, a nonprofit with the mission of advocating and educating on behalf of computer users around the world.


Netcat is considered the swiss army knife of network utilities.

Why is that? Here is the first sentence from the Sourceforge website, which summarizes what Netcat does very well:

"Netcat is a featured networking utility which reads and writes data across network connections, using the TCP/IP protocol."

You can imagine that a utility like this would be useful for network troubleshooting, including testing for holes in security-sensitive areas of the network.
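As a rough illustration of that read/write idea (a sketch, not Netcat itself), the core of a one-shot TCP client fits in a few lines of Python:

```python
# Minimal netcat-style client: connect, send a payload, return the reply.
import socket

def tcp_send(host, port, payload, timeout=2.0):
    """Open a TCP connection, send `payload` bytes, return the first reply chunk."""
    with socket.create_connection((host, port), timeout=timeout) as s:
        s.sendall(payload)
        return s.recv(4096)
```

Something like `tcp_send("example.com", 80, b"HEAD / HTTP/1.0\r\nHost: example.com\r\n\r\n")` plays the same role as piping a request into `nc example.com 80` - which is exactly why the real tool is so handy for troubleshooting and probing.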


Use Metasploit to test your networks with penetration techniques like those taught in my SEC030 class at CAIT - Center for the Application of Information Technology.

One uses Metasploit to try to exploit a system after an initial scan, usually done with Nmap or SuperScan.

Metasploit is not the end-all program for penetrating a network; a hacker will use his wiles to make headway into a network, or to accumulate more information for an eventual connection into a network or system.

Especially in cloud computing, one needs to think outside the box, as standard penetration techniques may not be enough to determine whether someone can penetrate your systems and data. Metasploit is just one possible tool in a hacker's toolset.


Interesting Yahoo news story about The Onion getting hacked by the Syrian Electronic Army.

Looking into this, one sees that the phishing emails targeted at Onion employees were successful with only 2 or 3 of them - which then allowed the hackers to target more employees with additional phishing emails.

Using this information, the hackers were able to access an account that controlled the Twitter feed, which allowed them to post bogus tweets.

This is an interesting exploit example; it shows that when it comes to social engineering, one must be wary of phishing emails.

Your identity and resources are waiting to be exploited by hackers for their nefarious ends.


IEEE cloud computing - a cloud computing standards web page

OpenStack is not the only organization trying to put forward interoperability profiles.

Of course sometimes the marketplace also creates standards by sheer momentum.

The reason this matters is that a company looking at computing resources in the cloud should look for services that do not create vendor lock-in. With this industry still in its infancy, it is important to review all the standards.

05/09/2013 Website

Discusses a warning about a new toolkit that makes it easier to create malware and other attacks on the Windows platform, Java, and Adobe.

This is the actual link

AVG ThreatLabs has also discussed the Cool Exploit Kit.

What does this mean? - Be extra careful of links and attachments, as new malware is being developed.


Internet Storm Center post about the Internet Explorer zero-day exploit, which has now taken so long to fix that Metasploit has an update that makes it easy to hack unpatched machines.

I.e. even the script kiddies (or people who only know enough to run basic scripts) can hack into machines now.

Here is also the Snort VRT update for the zero-day vulnerability.

When a new vulnerability is found, systems and applications are in the "zero-day" stage: we are vulnerable until a fix is released by Microsoft, which could take weeks.

There are things one can do in the meantime, like instituting a Snort rule in your IDS (assuming you have a Snort IDS, of course).
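For readers who have not written one, a Snort rule is a single line: a header plus options. The example below is a generic, hypothetical rule - the `content` pattern, message, and SID are made up, and this is not the actual VRT signature for the IE vulnerability:

```
alert tcp $EXTERNAL_NET any -> $HOME_NET 80 (msg:"Hypothetical exploit attempt - NOP sled"; content:"|90 90 90 90|"; sid:1000001; rev:1;)
```

SIDs of 1,000,000 and above are reserved for local rules, which is why a custom rule would use that range.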


The Rackspace cloud

Is really about developing a computing environment in the "cloud" which actually runs on Rackspace's server infrastructure.

Rackspace is all about the 'open' cloud computing.

It is based upon the OpenStack cloud operating system.

The problem with the hundreds of cloud companies is that there is no true standard for how the interfaces and control of systems, on racks across many data centers, are supposed to work.

If interested in cloud computing... then read this InformationWeek article


There is an interesting Apache exploit, summarized at Welivesecurity.

Once the Linux/Cdorked.A binary is "installed" on the infected machine, it allows backdoor entry for the hacker.

The binary allows 23 commands to be sent to it.

Here is a tool for you to download from their website (Welivesecurity).

Also check the comments, in case you have an older Python implementation:

Here is the relevant comment:

0o666 is the octal representation of the shared mem permissions (rw-rw-rw-), but doesn’t work with Python 2.5 and earlier.
You can change the decimal representation (so use 438 instead of 0o666).

With the tool you will be able to figure out if you have a legitimate Apache executable or not.


Navisite's data center list

Navisite has 8 data centers: 2 in the Chicago area (Oak Brook in the west suburbs, and downtown), LA and San Jose on the west coast, Syracuse and New York City in the New York area, Houston in the south/Midwest, and Andover, MA on the east coast.

So really there are 5 distinct geographic areas, which happen to coincide with major Internet NAPs (Network Access Points).

That means there was some thought into how their network was set up.

There are also some new international facilities in Africa, Australia, Netherlands, Japan, and Switzerland.

Navisite has several apps which would qualify as cloud apps - in fact, Software as a Service (SaaS) - such as Exchange, Oracle, and more.

Navisite Cloud Architecture

Navisite is one of the Gartner Magic Quadrant companies (above).


McAfee's solution to malware is integrated antimalware capabilities, which are:

1. Integrated
2. End-to-end
3. Real-time
4. Context-aware
5. Holistically managed


I am sure McAfee is trying to convey in the most apt way how their solution will solve your malware problems.

The only wrinkle in their plans may be that they are human and thus may make some mistakes and unfortunately not cover all the bases.

In the security world there aren't just 4 baseball diamond bases... the hackers and criminals are inventing new baseball bases all the time and thus may still bypass the well-qualified and excellent McAfee solution.

If a new base gets invented in center field or in the stands somewhere... how will your network/assets cope then?

My suggestion would be to use multiple vendors to cover various assets; even though that does not create an "integrated environment", it will safeguard you better in the long run.

Security requires effort and constant attention from your own organization, one _cannot_ completely outsource this function.


What are the most common vulnerabilities of the last 25 years?

From VRT-Blog at

Total vulnerabilities and highly critical vulnerabilities were up in 2012 after a significant downswing over the previous few years; 2012 was a record-breaking year for the number of most critical vulnerabilities, those with a CVSS score of 10.

Buffer overflows continue to be the most important type of vulnerability, with 35% of the total share of critical vulnerabilities over the last 25 years.

Interesting to note that Microsoft has also been bumped as the top vendor (highest number of vulnerabilities).

Oracle (with Java) has replaced Microsoft.


Sophos found a new Facebook scam... Facebook Black

Instead of changing the Blue background to Black it will just waste your time and make the scammer a little bit of money.

It may be possible for the scammer to hijack your Facebook account as well. This is called a "survey scam".

Clicking on "Change your Facebook color" might create some problems with your identity.

One of the scammer links points to a web page at a Japanese blog company.


The most interesting item in the Microsoft Security Intelligence Report is that 24% of systems scanned do not have any anti-malware software turned on...

Even basic free anti-malware software... like Windows Defender or Microsoft Security Essentials.

You can download Microsoft Security Essentials here.


An interesting DDoS attack is surfacing using an uncommon protocol: chargen.

The Internet Storm Center has a web page about the chargen-based DDoS.

Apparently some systems have port 19 open, and this has caused problems on their networks.

Unfortunately it seems to be financial companies seeing the problem.

This is an interesting comment on the page:

Chargen is easy to implement by accident on network gear - on cisco routers for instance it's implemented by "service tcp-small-services", which also enables the echo, discard and daytime services. "service udp-small-services" is the udp related command.

On Windows, it's common to see this service open when folks install "Simple TCP/IP Services" as part of their server build.

This is why it is important to do network audits: to check whether these services are on so they can be turned off.
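A basic audit check for this particular service can be sketched in Python - probe TCP port 19 on a host you administer (the host is a placeholder; only probe machines you are authorized to test):

```python
# Check whether a TCP chargen service (port 19) accepts connections.
import socket

def chargen_open(host, port=19, timeout=1.0):
    """Return True if the TCP port accepts a connection."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        s.settimeout(timeout)
        return s.connect_ex((host, port)) == 0
```

The UDP variant matters too for the reflection attack, so a real audit would also check the `udp-small-servers`-style services the comment mentions.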


Peer1 hosting is another up and coming hosting company in the Gartner Magic Quadrant

Peer1 has 19 data centers and 21 POPs (Points of Presence).

5 locations in Europe, 3 in Canada, and 11 in the USA.

Website performance is enhanced with Anycast DNS. DDoS mitigation, a 10 Gbps network, and 24x7x365 monitoring are included in the service.

The tech support is supposed to be a direct call to your personal account manager, not help desks or junior techs.

There is also a blog which discusses the hosting company.

Interesting that this hosting company decided not to have an Asian presence.


This morning I was at the CAIT (Washington University) roundtable discussion at the Knight Center.

Building an Application Security Program/Effective Security in SDLC was today's topic; about 30+ security professionals from around the area were in attendance.

The first two hours were a presentation on SSDLC (Secure Software Development Life Cycle).

The next couple of hours were Jerry Hoff of WhiteHat Security discussing static code analysis and various items he has worked on over the last 10+ years.

Here is an interesting video from AppsecUSA where he spoke as well.

He developed and many more.

The main message was that web apps are not secure because they were not built with security in mind.

He also mentioned his blog at Computerworld.


Carpathia Hosting is one of the hosting companies that was added to the managed hosting Gartner Magic Quadrant this month.

There is something called an IBX Vault data center located in Dulles, VA.
The "advantages" are:

1. Location - within 50 miles of Washington DC.
2. Compliance - name the compliance standard, it is there.
3. Connectivity - multiple Tier 1 telecom carriers as part of the Platform Equinix family.
4. Services - multiple services.

Personally, I don't see why being close to Washington DC is a great thing. What matters to me is how close one is to the Internet NAPs, not to the capital of the USA. Of course this may be important to a company in the capital or in Baltimore.

The Carpathia website does have a good explanation of PCI compliance


What about virtual desktop security?

Virtual machine security?

Any virtual instance also needs to be reviewed and tested - don't forget a computer system just because it is a virtual machine. If it is on your network with an IP address, it could make your network vulnerable. The weakest link can cause a breach.

The images are from a brochure.


The new Gartner Magic Quadrant came out (April 2013).

Gartner cloud computing Magic Quadrant: it looks like Amazon Web Services is not in the Quadrant now, although it was included in the topmost area for cloud IaaS in the October 2012 report.

The reports may not be equivalent, but it is interesting...

Also Tier3, Bluelock, Joyent, Softlayer, Fujitsu, Dell, and Virtustream were dropped.

Rackspace moved up a bit, and AT&T was added - into the highest area.

Verizon Terremark and Savvis stayed in their top tier area.

Layered Tech, Carpathia hosting, Peer 1 hosting, Sungard, and Datapipe were added in the lower areas: I will review some of these companies in the coming days.


LogicWorks has a page that explains its cloud automation and scaling solutions:

Performance and remote monitoring tools (powered by ScienceLogic)
Configuration management tools (by RightScale), an automated cloud management platform
The VMware hypervisor and Citrix CloudPlatform are used as well (most companies use these)
Auto-scaling - dynamic resizing is handled by Riverbed technology
Managed content delivery services are taken care of by Akamai
The Citrix XenServer hypervisor is also used

Their data centers are located in New York, New Jersey, and California in the USA, plus Amsterdam and Singapore overseas for a European and Asian presence as well.

The following link has all of the details of Logicworks' carrier-grade capabilities.

LogicWorks was founded in 1993 and Private Cloud hosting started in 2007.

#9 in the TalkinCloud100 list


Everyone should be aware of social engineering and its potential effects on your identity or company security.

This article in CSO Online is a great start to understanding social engineering.


The Equinix cloud company is #4 in the Talkin Cloud 100 index from 2012.

The company was founded in 1998 and has 10 current data centers in the US, with two more being built.
5 data centers in Asia, 1 under construction.
8 data centers in Europe, 1 under construction.
1 in the Middle East (Dubai, UAE).

This company also interconnects with other cloud companies, such as GoGrid, and more.

It has data centers in 30 markets across 13 countries.

So if geographic diversity and an international footprint are important, this company definitely provides that option.
Redundancy and performance can be achieved with application performance nodes.

This gives reliability at 99.9999% uptime - the six sigma number, as I have discussed on my 6sigma page.
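As a quick sanity check on what those percentages mean, the allowed downtime per year is simple arithmetic (assuming a 365-day year):

```python
# Downtime budget implied by an uptime percentage, over a 365-day year.
def downtime_seconds_per_year(uptime_pct):
    return (100.0 - uptime_pct) / 100.0 * 365 * 24 * 3600

for pct in (99.9, 99.99, 99.9999):
    print(f"{pct}% uptime allows {downtime_seconds_per_year(pct):,.1f} seconds of downtime per year")
```

So 99.9999% is roughly 32 seconds of downtime a year, versus almost 9 hours at 99.9%.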

In marketing speak Equinix calls its service: Platform Equinix.


Mindshift desktop-as-a-service cloud computing, also referred to as DaaS.

The cloudSHIFT desktop is either a standard Microsoft Terminal Server remote desktop session
or a Citrix HDX connection; the Citrix client allows a different type of customization.

MindShift company information:

The data centers are "geographically dispersed" with no actual details of where they are.


A couple of large entity cloud players that are known for other market spaces:
Both are considered niche players in the Gartner magic Quadrant.

Fujitsu cloud

Fujitsu ServerView Resource Orchestrator Cloud Edition is management software that delivers cloud IT infrastructure.
Fujitsu Primergy is a server architecture on the x86 standard; a 10U chassis can house 18 server blades.

Dell Cloud Computing

Dell has Desktop-as-a-Service cloud services - not just Microsoft's RDS, but also Citrix XenDesktop, VMware View, and Dell DVS Enterprise ISS.

Dell also has an enterprise notification system (AlertFind) - automated message delivery of text, voice calls, emails, paging, and faxes.

It looks like both Dell and Fujitsu have specialized cloud services focused on DaaS and mass communication services, as well as building the cloud infrastructure itself.

This is why Gartner considers them niche players.


Are you aware of the NIST (National Institute of Standards and Technology) definition of cloud computing?
Excellent NIST report

Cloud computing is a model for enabling ubiquitous, convenient, on-demand network access to a shared pool of configurable computing resources (e.g., networks, servers, storage, applications, and services) that can be rapidly provisioned and released with minimal management effort or service provider interaction.
This cloud model is composed of five essential characteristics, three service models, and four deployment models.

5 Essential Characteristics

On-demand self-service -- change computing resources without human intervention
Broad network access -- can be used from many different platforms (smartphones, tablets, laptops, workstations)
Resource pooling -- computing resources are pooled to give customers more or less computing power
Rapid elasticity -- increase or decrease computing power as needed
Measured service -- the ability to meter service (storage, processing, bandwidth, and user accounts)

Service Models

Software as a Service (SaaS)
Platform as a Service (PaaS)
Infrastructure as a Service (IaaS)

Deployment Models

Private Cloud
Community Cloud
Public Cloud
Hybrid Cloud


One of Softlayer's cloud solutions is MongoDB.

If you need MongoDB for big data applications, Softlayer has a partnership with 10gen for MongoDB servers running a CentOS distribution tuned for enhanced performance.

In less than 2 hours MongoDB integrates into your cloud infrastructure.

This is definitely how a cloud should operate, quickly and more efficiently than you could do on your own.

It also looks like Softlayer has a large number of servers in 5 US locations, and also in Europe (Amsterdam) and Asia (Singapore).

Softlayer Facilities:

Dallas 104,500+ Servers
Seattle 10,000+ Servers
Washington 16,000+ Servers
Houston 25,000+ Servers
San Jose 12,000+ Servers

Amsterdam 8,000+ Servers
Singapore 16,000+ Servers

Each datacenter has a "pod" which supports up to 5000 servers. Each pod has standardized Network, Storage, and Security infrastructure. The power and environment is also controlled and supports the pod.

Even if many other cloud providers have a similar infrastructure, you can see Softlayer has large-scale capability.

04/05/2013 - Here is the information you need to be compliant with the Payment Card Industry (PCI) Data Security Standard: PCI compliance document - v2.0. Here are the main points of the document:

  • Install and maintain a firewall
  • Protect cardholder data
  • Encrypt the transmission of cardholder data
  • Run and update an anti-virus program
  • Develop and maintain secure systems and applications
  • Restrict access to cardholder data with strong access control measures
  • Restrict physical access to cardholder data
  • Track and monitor all access to network resources and cardholder data
  • Regularly test security systems and processes
  • Maintain a security policy that addresses information security for all personnel

    All of these items are covered in many security books and thus in my opinion are just common sense.

    Although in the audit business one has to confirm people perform the work they are supposed to do.


  • Joyent Blog

    Joyent is a cloud company that has unique cloud computing products and services.

    Node.js is an application platform that runs on SmartOS. From their website: "Node.js is a server-side JavaScript environment that uses an asynchronous, non-blocking, event-driven I/O model for scripting highly concurrent programs."

    LinkedIn, Sprint, Microsoft, and Voxer are using SmartOS (a unique cloud hardware and operating system stack that runs ZFS and has high resiliency).
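    The quoted description - asynchronous, non-blocking, event-driven I/O - is the same model Python offers through asyncio, so a rough feel for it does not require JavaScript. A minimal sketch (the "requests" here are simulated with sleeps):

```python
# Event-driven concurrency: 100 simulated I/O waits run concurrently
# on a single thread -- the model the Node.js description refers to.
import asyncio

async def fake_request(i):
    await asyncio.sleep(0.01)  # stands in for a network round trip
    return f"response {i}"

async def main():
    # gather() schedules all coroutines; their waits overlap.
    return await asyncio.gather(*(fake_request(i) for i in range(100)))

results = asyncio.run(main())
print(len(results))  # 100
```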


    Economist article about the digital arms trade -- essentially hackers who ply their trade to other hackers and criminals.

    Everyone who has a network connection on the Internet should be aware of the dangers of connecting to the Internet.

    Usor Emptor - user beware (adapted from the Latin caveat emptor, with "usor" standing in for "user").


    A web application firewall can mitigate a denial-of-service attack and help with website compliance.

    If you collect credit card information over the Internet, then you have to be in PCI (Payment Card Industry) compliance.

    Penetration testing is also needed to be in compliance.

    Interesting to note that the latest blog post on KrebsOnSecurity also mentions TDoS - Telephone Denial of Service - where many phone calls create an environment in which one cannot run one's business.

    In reading the whole article, there is an easy way to get these attacks started once one knows where to go (in the underground); the costs are not that high: only $50 for certain attacks.

    I would definitely be ready for these types of attacks, as the target may be you in the near future.


    Another Gartner Quadrant cloud company: CSC - Cloud Computing Services

    A fifty-plus-year-old company, started on April 16, 1959. Here is the company profile.

    The website does not specifically state where the data centers are, but there are 4 major offices: HQ in Falls Church, VA, plus Australia, Asia, and Europe. These are broad strokes, but it does tell you they are a multi-national company with longevity.

    The list of offerings includes many information technology services and solutions. The cloud service is not broken out and marketed like many other companies have done.

    This company looks to be a consulting company, and maybe they do cloud computing, but it does not seem to be their main focus.

    Cloud computing success story: ETS CloudCompute model

    The solution was to Deploy CSC CloudCompute, a VMware vCloud Datacenter Service.


    Bluelock is also one of the premier cloud companies in the Gartner magic quadrant

    Bluelock has 2 different types of virtual data centers

    the 5-Series and 2-Series

    The 5-Series is the high-end data center: high availability, advanced services, a better end-user experience, and enterprise support levels. SLA uptime is 99.99%.

    The 2-Series is set up as a test platform with 99.9% SLA uptime, self-service backups, antivirus, patching, monitoring, and basic licensing only. Here one has the option to turn off automated reboots, so that the test environments are not worked on by the Bluelock engineers.

    The Bluelock cloud is run by VMware vCloud. The Bluelock data centers are located in Indianapolis, IN and Salt Lake City, UT.

    Load balancing is run by an industry standard: F5, which is the BIG-IP local traffic manager VE virtual appliance. The appliance is inside the Bluelock Virtual data center.

    Internet billing is done by 5-minute sampling, and if the committed traffic amount is exceeded, there are allowances for passing the threshold for brief moments without having to move up to the next network usage commitment (i.e., without financial penalty).


    Terremark is a Verizon cloud hosting company

    Infrastructure and cloud service link: Infrastructure Cloud services

    Terremark's cloud blog

    SLA levels can be customized for enterprise level cloud application capabilities.

    You can control your cloud systems through an intuitive web-based interface, dynamically provisioning new servers, storage, and processors, and selecting from pre-configured templates or custom configurations.

    There is role-based access control, which can extend your network with identity and directory services (LDAP / Active Directory) - this is considered a hybrid cloud.

    3 data centers in Texas. 1 Miami data center NAP.

    Terremark does not have a standardized cloud solution for terminal services (DaaS) but could create an enterprise cloud that would give that capability. Although the expense would not be justified for a small implementation.


    The Gartner Magic Quadrant document costs $1995.00.

    So when Savvis makes the report available on their website, with a link back to the actual Gartner page, it does mean something.

    To me the 3 challengers: GoGrid, Bluelock, and Joyent are the most interesting.
    Although the leaders should be reviewed as well: Amazon Web services, Savvis, CSC, Dimension Data, and Terremark.

    In the coming days I will review all of the Gartner top leaders and challengers in the quadrant. The Gartner Quadrant was completed on October 2012.


    Here are 50 questions from the Forbes article about what to ask a cloud company.

    I want to focus on the SLA questions.

  • What is the uptime and performance SLA?
  • Is there a financial penalty when/if the SLA is missed?
  • Can computing resources be added and subtracted in an automated fashion?
  • What is the customer service response time, and is it available 24 hrs a day?
  • I also like the monitoring and visibility question, how can a customer monitor the uptime of their service?

    Here are some other questions to keep in mind as well.

  • In a Disaster recovery scenario, how quickly can operations be restored in case of a physical or network outage?
  • Is there vendor lock-in for data or applications?
  • How difficult would it be to leave the cloud vendor?

    I picked 8 questions that you need to ask a cloud computing company, besides how much it costs to move what you need to the cloud.


  • Datapipe has a 100% uptime guarantee

    The company has 8 data centers, located in Silicon Valley, Virginia, New Jersey(2), London(2), Iceland, Hong Kong, and Shanghai.

    The guarantee is built with multiple redundant network connections, as well as redundant routers and switch configurations.

    Datapipe is SSAE 16 certified - the standard that replaced SAS 70, announced in 2010 and effective June 15, 2011.

    You can access their Whitepaper on security and compliance at this link

    The SLAs are designed to be customized, but there is a 1-hour hardware guarantee, as well as 100% power and cooling infrastructure availability.

    There is no quick online cost information; one has to fill out a form and someone will contact you.

    It does not look like they offer DaaS; this is an IaaS company with various managed services handled in-house.


    The cloud paradigm shift in IT

    To fully use the cloud in your company, one must make some organizational changes.
    Just as when email was first introduced: the change in communication could not occur until people checked their email on their own computers.

    The obvious changes are in what to purchase; one no longer has to upgrade software, since that is done in the cloud.

    The soft changes are in how the office workers can fulfill their tasks, since now everyone can access all of their office data and computing capabilities from wherever there is an Internet connection.

    This organizational change will take time to work out across different tasks. One could hire more home workers and reduce the office footprint, cutting office overhead while depending on workers' ability to do their assigned tasks efficiently from home. Each business model has its own challenges.

    The cloud paradigm will allow lower costs in different businesses; how one uses computer applications will determine how much the cloud can help.


    Today's review:
    GoGrid is an IaaS, or Infrastructure as a Service, cloud company.

    They have 3 data centers: San Francisco, CA; Ashburn, VA; and Amsterdam, Netherlands.

    They are not a DaaS company, only IaaS - server platforms like Win2003, Win2008, or Win2012.

    There is a pricing calculator link.

    On the calculator you should know how much RAM and hard drive space you will need; outgoing network usage is also metered, whereas incoming usage is free.

    But most interesting is that there are no managed services in-house; GoGrid will refer you to their various partners.

    Essentially you need someone like myself providing 'managed IT services' to help you navigate the cloud servers.

    Cloud servers could cost as little as $181.25 per year. Of course the more one uses the server (network use) the more it will cost.

    GoGrid is in the Gartner Magic Quadrant, at the upper left, where niche players and challengers are placed.

    Definitely a company to keep an eye on.


    Contegix is a local (St. Louis area) cloud company.

    Their differentiation points have to do with superior service and technology guarantees.

    They have a 95%+ customer retention rate due to their managed service delivery.

    The physical capabilities (power, cooling, and communication) are redundant and capable even with various potential natural events.

    This is why they can offer a 100% network power uptime guarantee along with a 30-minute replacement guarantee for all critical business infrastructure.

    Most interesting is their response time (email) - 5 minutes with Tier 3 engineers.

    Also, the lag to the Chicago NAP is 5-6 ms, which is also very good.


    Dimension Data is a cloud company with 99.99% uptime SLA.

    Their US datacenters are located in Virginia and California.

    At this time Windows7 is not a supported platform, so Desktop as a Service is not available.

    What happens is that one provisions a network and then a virtual machine slot.

    So Dimension Data supports all of the standard server OSes: RedHat, CentOS, Win2003, and Win2008.

    But not the desktop OSes, Win7 and Win8. One can manage the server but not run desktop applications.

    Running the cloud systems on Dimension Data will require a certain amount of expertise.

    In my opinion the VA and CA locations are not ideal for a Saint Louis company (where I am located), so I would look at other companies to host my servers.

    Dimension Data may be a good IaaS company, but one definitely needs to know what one is doing to get the full fruits of one's labor. Here is a link to their free trial.


    The top 20 CRN cloud data storage companies:

    Acronis - an enterprise backup and recovery cloud company.

    Asigra - a backup company for MSPs.

    Bitcasa - infinite storage for all your devices.

    Box - a consumer data storage company.

    Carbonite - backs up home and small business client devices.

    Caringo - specializes in very large data storage solutions.

    Cobalt Iron - private or public cloud backup solutions.

    Ctera Networks - backups of remote office storage using a hybrid cloud storage solution.

    Dropbox - mobile backup; has business plans as well.

    eFolder - branded cloud backup solutions, for companies to sell cloud backup under their own name.

    EMC (US site) - makes backup hardware.

    EVault - a Seagate company providing online backups.

    Intronis - cloud backup and recovery for the IT channel.

    Mezeo Software - storage service layers; solutions for telcos and service providers.

    Nasuni - storage infrastructure delivered as a service.

    Nirvanix - just added cloud storage services to the Intel AppUp SMB service.

    SoftLayer - big data: MongoDB for big data applications.

    SugarSync - business and personal storage solutions.

    Symform - Seattle startup providing an unlimited free secure offsite network.

    These 20 (or actually 19) companies have some interesting promises, including free storage.


    CRN is categorizing 100 cloud vendors by: platform, infrastructure, storage, security and software.

    I guess that equates to PaaS, IaaS, SaaS, and DBaaS, and I am not sure how the security firms fit into how others categorize cloud vendors.

    Security vendors:
    AnchorFree vpn service co.
    Appthority App security application
    Bromium Introduces vSentry - a hw protection against malware.
    Checkpoint firewalls and other security products.
    CipherCloudcloud data protection for salesforce.
    CloudPassageSecuring your cloud servers with a policy and no HW (SaaS model)
    FireHostVmWare based secure cloud environment.
    Fortinet Hardware network security solutions
    McAfee is an antivirus software company
    Nimbula cloud computing software on the style of EC2(Amazon)
    NTT America Has a NOC that can help manageyour enterprise cloud
    OktaA cloud Identity Access management company
    Oxygen CloudEnterprise Cloud drive any device access to cloud.
    PorticorVirtual key management - Homomorphic Key Encryption.
    ScaleXtremeCloud based server monitoring
    Skyhigh NetworksA way to discover and understand all the cloud services used by your company
    Symantec Enterprise Antivirus solutions
    Trend MicroAntivirus software
    Veracode Application security testing program
    Zscaler Gives a snapshot of network traffic on your enterprise.

    Each of these vendors has a cloud product offering and can help companies with their cloud integration.


    The Skyhigh Networks product offering is the most interesting.

    An interesting IBM study is available on their website.

    The study was based on 1,700 face-to-face conversations with CEOs in 64 countries.

    These CEOs are managing change with more open and collaborative environments in their companies.

    Face-to-face is still the main way to interact with customers, but social media will be a change they are embracing.

    A quick synopsis:

  • Empowering employees through values
  • Engaging customers as individuals
  • Amplifying innovation with partnerships

    Technology is taking the top spot of all external forces that could impact the company.
    2nd spot: People skills
    3rd: Market factors

    How are they implementing or taking into account the changes to come?

    Operational control is giving way to organizational openness. A great quote:

    {Instead, CEOs are increasingly focused on finding employees with the ability to constantly reinvent themselves. These employees are comfortable with change; they learn as they go, often from others’ experiences. As a healthcare CEO from Australia explained, “Today’s connected economy is full of ambiguity, and the characteristics required to navigate that ambiguity are collaboration, creativity and communication.” }


    I can help you with technology and social media integration within your company using cloud computing.

    Tony Zafiropoulos C: 314-504-3974

  • CIO Index has a short blurb in its free section about TCO for a cloud-based solution:

  • For a 52-user deployment, the cloud-based solution is 55% lower than a local infrastructure solution.

  • For a 100-user deployment, application implementation and support is projected at 3.5 times more for an on-premise deployment.

    Of course one cannot see the details unless paying for the research.

    The economies of scale are always on the side of the cloud based solution.

    The only question is how much cheaper will it be.

    The key is to determine business goals and decide how to implement the goals.


  • Total Cloud Cost of Ownership (TCCO) is what all owners and managers want to see to help them decide when and how to go to the cloud; you can also call it Cloud Return on Investment (CROI).

    Since a cloud investment is a monthly cost, it is important to make a comparison with a monthly service cost.

    We need to keep the comparison costs on monthly targets.

    What we need is a formula to plug in our own numbers and information to give us the data necessary for a good decision.

    So if you have an application server for 50 users that your IT department manages, what is the monthly IT labor cost?
    What (if any) are the utility costs?
    The cost of the purchased system should be amortized over 3 years (example: an HP ProLiant DL385 = $6,000).
    The network cost should not be included, since one needs a network in either case (cloud or not).
    Software purchased should also be amortized over three years, since if the hardware becomes obsolete, the software on the hardware should be considered obsolete as well (example: a Windows 2000 operating system is not useful on a 2008 computer system). Say $2,900 for a 50-user Windows 2012 license.
    Space considerations should only be figured in if there is little space for a data center location. So far we have $6,000 for hardware and $2,900 for software, and we have not spent any money on application software or the initial software installation. I will assume the application software and initial install costs will be $3,100.

    That would mean the total initial cost is $12,000 (incidentally, $1,000 per month if spread over a single year).

    Now the only uncertain item is the labor cost. How much would labor cost for standard maintenance, adds, and changes for 50 users? Training is another cost, a soft one: either one pays for it directly, or one pays for it in the longer time it takes to resolve a problem. I will assume $200 per user, or $10,000 for the year.
    If one needs $240 per user, the yearly cost becomes $12,000, which adds $1,000 a month.
    Now the key is the amortization of the $12,000 computer system. Over 36 months that is $333/month.

    Labor plus amortization comes to $1,333 per month. That is the monthly cloud cost we need to stay under to save money.
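The arithmetic above can be sketched in a few lines; every dollar figure here is one of the assumptions already stated in this post.

```python
# A sketch of the on-premise monthly-cost arithmetic above.
# All dollar figures are the assumptions stated in the post.

AMORT_MONTHS = 36       # 3-year amortization
USERS = 50

hardware = 6000         # HP ProLiant DL385
os_software = 2900      # 50-user Windows 2012 licensing
app_and_install = 3100  # assumed application software + initial install

capital = hardware + os_software + app_and_install  # $12,000 initial cost
amortization = capital / AMORT_MONTHS               # ~$333/month

labor = USERS * 240 / 12                            # $1,000/month at $240/user/yr

monthly_on_premise = amortization + labor           # ~$1,333/month
# A cloud service must cost less than this per month to save money.
```

Plugging your own numbers into the same structure gives you the monthly target a cloud quote has to beat.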


    An interesting article about where the IT in-house resources and Cloud computing should fit under the IT department umbrella.

    HP article about cloud computing

    If an in-house IT service is able to provision a new server within hours, along with the software that is needed, then should it not be considered cloud-like?

    The definition of cloud computing is quickly provisioning new applications and spinning up new resources: the user asks for an application and receives it in minutes, not days or hours. The 'new' thinking of cloud-or-nothing is not the correct logic to follow. One must ask the correct questions.

    Questions like:

    What are the goals of this endeavor?
    What is our capacity with this infrastructure (example: 1,000 Exchange mailboxes)?
    How fast can we spin up new servers with VMware in our current infrastructure?
    How fast can we install and thus provide resources when needed by the various departments?
    These are the two important sentences in the article:

    "As the upstart challengers, cloud proponents have aggressively promoted the model as the way forward. There’s nothing wrong with that—there’s no progress against the status quo without strong belief and a willingness to break prevailing assumptions."

    "But there’s also more than a little “If you’re not with us, you’re against us” cliquishness and “It’s cloud or nothing” bravado."

    One does not need to be a cloud zealot, instead one should be a cloud and IT questioner. Make sure you ask the right questions and match them with the goals and intents of your company needs.


    Why is it backups do not get the proper emphasis?

  • No revenue comes out of a backup.
  • It takes time and effort from all involved, and there is not enough time as it is.
  • Testing a backup is required to make sure the backup works.
  • Leadership must be behind the effort
  • Each item is difficult; altogether they require a major effort and will

    Do you know the status of your backup and testing of a recovery effort?

    Are the reasons so hard to overcome that most businesses do not know the test-and-recover status of even one of their backups?

    Here are some backup articles in case you want more reasons:

    Apple link to Backing up your Mac OS X machine.

    How-To Geek article on what files to back up

    Microsoft backup and restore article

    Just one solution for cloud backups:
    SavvisDirect solution for backing up your files.


  • Is it true that most small businesses do not have Disaster Recovery (DR) plans?

    It would make sense, since most small businesses do not have time and money for many things: why create a DR plan?

    And yet this is the one cloud application all small businesses should have.

    Quorum is a company that focuses on DR and just received $11 million in funding on 3/5/13.

    The Aberdeen Group says each hour of downtime costs on average $74,000 for mid-sized companies and $6,900 for small companies (less than 100 employees).

    Downtime events total approximately 2-3 hours for small organizations, at the estimated $6,900/hour. This information comes from the Quorum website and the Aberdeen Group Insight results.
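A rough sketch of what those figures imply per year; my assumption here is that the "2-3 hours" means total annual downtime for a small organization.

```python
# Rough annual downtime cost from the Aberdeen figures quoted above.
# Assumption (mine): the "2-3 hours" is total downtime per year.

SMALL_BIZ_COST_PER_HOUR = 6900    # <100 employees
MIDSIZE_COST_PER_HOUR = 74000     # mid-sized company

def annual_downtime_cost(hours_per_year, cost_per_hour):
    return hours_per_year * cost_per_hour

low = annual_downtime_cost(2, SMALL_BIZ_COST_PER_HOUR)   # $13,800
high = annual_downtime_cost(3, SMALL_BIZ_COST_PER_HOUR)  # $20,700
```

Even at the small-business rate, that is a five-figure annual exposure, which puts the cost of a DR plan in perspective.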

    And most interesting:
    Studies have shown that 93% of companies that lose their datacenter for 10 days file for bankruptcy within one year. This is according to The United States National Archives

    Quorum also has an interesting argument: if one uses a cloud backup solution, how do you recover the data quickly enough to restore all the information systems that went down? The planning of disaster recovery is what is important. Quorum claims to do that better than others; it bears keeping in mind when looking into DR services.


    There are a few cloud companies claiming 100% uptime guarantee:

    Datapipe an IaaS cloud company

    Tier3 advertises a 99.999% SLA and a 100% uptime guarantee excluding scheduled maintenance. Also an IaaS cloud company.

    Rackspace has a 100% network uptime SLA. Also an IaaS

    Codero has a 100% uptime guarantee excluding scheduled maintenance of cooling, computer hardware and network. Another IaaS.

    One has to be careful in selecting cloud companies, as they may have a shared web hosting offering which also claims 100% uptime or close to it (99.9%). A shared web hosting environment is not, in my opinion, a cloud environment.


    The cloud is a great concept and can achieve cost savings within several IT areas in a company. Where can the cloud help your company?

    Let's discuss several types of cloud applications that have cost saving implications:

  • Backup - backup your files and automatically have disaster recovery abilities
  • Filesharing - Share your files just like on a fileserver at the office, except now the fileserver is in the "cloud" as a computer on the Internet
  • Sharepoint - use the power of Microsoft's Sharepoint on the cloud, by using a server on the Internet as your Sharepoint server.
  • Calendar - share your calendar either within Microsoft Exchange or Google's gmail on the Internet instead of just on your network (which makes email/calendaring a cloud application)
  • Salesforce-CRM - Salesforce is a Customer Relationship Manager cloud application (as are other CRM applications), which you can access from anywhere on the Internet to better manage the customer sales connection
  • Office Applications - Use them on your DaaS cloud server, it is your desktop on an Internet server.
  • Meetings online - connect with your team members by connecting with a cloud collaboration application like Webex.
  • Antivirus - There are antivirus solutions on the cloud, where you do not have to host your own anti-virus server.
  • Video - MVaaS Managed Video as a Service, where multiple digital cameras or other networked video recorders (security cameras) can be hosted on a server on the cloud/Internet.
  • ERP - Enterprise Resource Planning applications on the Internet in a cloud form.

  • Financial applications - There are many financial applications on the cloud, where you no longer have to house the server yourself, which allows you to share your accounting info with your accountant.
  • testing and developing applications - A Platform as a Service cloud provider will help you design applications quicker than ever before.


  • Rackspace buys ObjectRocket - CRN article

    Check out the latest news: Rackspace (the #2 cloud company according to the TalkinCloud 100) has purchased ObjectRocket, a DBaaS which apparently has a cutting-edge cloud service. Most interesting, the Rackspace CTO says it runs NoSQL databases.

    eBay and PayPal are on ObjectRocket's customer list.

    Here are some NoSQL reference sites, including TechRepublic's "10 things you should know about NoSQL databases": TechRepublic

    Interesting points:

    No more DBAs?

    Of course it needs to mature, and expertise is needed, but it is scalable, economical, and has flexible data models.

    I must research this in more detail.



    Cloud migration versus local server 5 user system comparison:

    5 users, Office software only (email service not included).
    10GB storage for each user. Backup of the server on the cloud either way (when buying a server, the backup is a click of a button and costs $1,500/year).

    The least expensive server costs $1,549; with the $1,500 backup service that is $3,049. The problem with buying the least expensive server is that it will likely last only 2 years, depending on your needs.

    The big question is how much setup and maintenance will the local system need versus the cloud setup? Let’s assume the setup is equivalent – how about the maintenance? Even if there is only 1 hour of maintenance per month on the local server… that would cost $1200 at $100/hr.

    So the cost for cloud versus local server: $358/month = $4,296/yr for cloud, versus $4,248 first-year costs for the local server, not including Office costs. This does not include the upgrade for any Office software needed if you are currently running Office 2007 or lower; a 5-user upgrade to Office 2010 is $550.

    So the difference in the first year is $550 more for the local server, plus $500 for installation.

    The pros of cloud computing: about $1,000 less in the first year.
    Not having to worry about upgrades and the server.
    Yes, the cost is $358/month, but the backup and many other things are taken care of. You don't have to have a computer tech available when the server has problems.
    When there are issues with cloud software, contact tech support.
    You can use your files anywhere, on the road or in the office.

    The cons of cloud computing: the second year will still cost $358/month, versus an unknown amount for the local server.
    Unknown maintenance and downtime costs.

    How many hours will the system be down when you need it?
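The first-year arithmetic above can be laid out explicitly; the $500 installation figure is the assumption already used in this comparison.

```python
# First-year cost comparison sketch using the numbers above.

# Cloud: a flat $358/month.
cloud_year1 = 358 * 12                  # $4,296

# Local server: hardware + backup service + ~1 hr/month maintenance
# at $100/hr, plus the Office 2010 upgrade and an assumed $500 install.
local_year1 = 1549 + 1500 + 12 * 100 + 550 + 500   # $5,299

difference = local_year1 - cloud_year1  # cloud is roughly $1,000 cheaper
```

Note how the comparison flips in year two: the cloud keeps costing $358/month, while the local server's remaining costs (backup, maintenance, downtime) are the unknown.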

    Contact me for specific details of which company was used for the comparison.


    Caution ahead ... Future prediction:

    Two years from now most businesses will not have an onsite server...

    It is cheaper,
    less hassle,
    automatically capable of disaster recovery,
    remote access capable,
    no more server downtime (99.999% uptime is literally about 5 minutes per year; that little means no downtime in my book)

    The only downtime will occur on your end, and then all you have to do is switch computers.

    If set up correctly with 2 computers and 2 Internet connections it would mean no downtime.
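The SLA percentages quoted in this post translate directly into allowed downtime per year, which is worth checking before taking an uptime guarantee at face value:

```python
# Convert an uptime SLA percentage into allowed downtime per year.
# 99.999% ("five nines") really is about 5 minutes a year.

MINUTES_PER_YEAR = 365 * 24 * 60  # 525,600

def downtime_minutes_per_year(uptime_percent):
    return (1 - uptime_percent / 100) * MINUTES_PER_YEAR

five_nines = downtime_minutes_per_year(99.999)   # ~5.3 minutes
four_nines = downtime_minutes_per_year(99.99)    # ~52.6 minutes
three_nines = downtime_minutes_per_year(99.9)    # ~525.6 minutes (8.76 hours)
```

That one decimal place makes a tenfold difference in allowed downtime, which is why a shared host's "99.9%" is not comparable to a cloud provider's "99.999%".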


    Cloud Project management - what to keep in mind when going to the cloud:

    Now that I have a cloud migration under my belt (moved a client's application to cloud provider) here are my recommendations, when planning on moving to the cloud:

    1. Insist on a detailed plan for data migration (user files, customizations, customer client data); ask _how_ data will be migrated from the current systems to the cloud systems.
    2. Email migration must be seamless - usernames, files, customizations (auto-name recognition for email).
    3. The application database must be moved at the correct version: stop the services, migrate, then restart on the new system.
    4. When will the migration happen, and how will the domains be pointed?
    5. What will be the new password policy? Now that anyone on the Internet can access your system, how will you protect it?
    6. How will you audit the cloud system, and with what frequency (for security and backup reasons)?
    7. When will the system be maintained? Do you need 24x7x365 access?


    It is an interesting map, this map should help you in deciding where your future cloud company should be located in the US.

    The map was created at Infocellar

    For example, we are in St. Louis, which means the Ameritech NAP (Chicago) is the closest; the PacBell NAP might be closer than MAE-West. The Sprint NAP on the east coast or the MAE-East NAP may not have a direct peering connection to Saint Louis.

    For the most part we are talking milliseconds, but it could be important if one has an application that does not do well with lag, such as a command line with feedback (just as an example). Of course when developing apps one must keep in mind the network usage needs of the app.

    The differences between the NAPs may be miniscule, but one should review the data when a specific need is desired.


    Cyberwar was on the political roundtable in this morning's show:
    On RealClearPolitics, in case you missed the ABC news segment, from 5:23 until about 12:30.

    Congressman Mike Rogers, the Chair of the Intelligence Committee, discusses the People's Liberation Army (PLA) computer attacks on US companies.

    There are other topics on the video, but cyberwar was on the show and it showed the seriousness of the attacks on our computer infrastructure.

    They were discussing Denial of Service attacks and other attacks that criminals use to steal financial information, as well as countries which are using computer attacks as a weapon of war.

    This is basic stuff, but everyone must have up-to-date computer infrastructure and spend money to keep everything current; otherwise something will happen to you, as it is only a matter of time. Either a criminal or another country will attack you to see what they can learn for their advantage.

    Minimally, one must keep up with all maintenance so as to control your destiny. Why take a chance on viruses, espionage, or other nefarious activity?

    Or take the time to outsource to the cloud this activity, if you do not have the time or resources to complete the task.

    This is a wake up call for anyone with a computer on the Internet.


    A powerful reason to move to the cloud is to move your application to a mobile platform, or to an offsite server location: Powerbuilder migration Link.

    The above link explains some of the reasons to move to the cloud for old legacy applications which are then re-written with cloud based programming methods. I.e. J2EE (Java), or .net (Microsoft technologies), or even LAMP - Which stands for Linux - Apache - MySQL - PHP or Python/Perl.

    It all depends on what you need to get done to make the application cloud friendly.

    You can contact me in Saint Louis area to migrate your application to the cloud.

    Tony Zafiropoulos - 314-5043974 tonyz"@"


    Today it is a snow day in Saint Louis, which means I stay in and fix the FBI Cybercrime virus...

    This laptop had a built-in webcam, like many laptops, and the FBI Cybercrime virus took a picture of the person sitting at the computer (which I replaced with my own picture, and made it obvious, since a webcam picture is typically out of focus and does not have enough light). What this virus wants you to do is go get a MoneyPak from various retail outlets (for which it helpfully provides suggestions). Then once you put $200 on a MoneyPak it will supposedly release your computer, although I would not bet on that.

    Many viruses will also remove other valuable items on your computer, including passwords, bank access items among other things.

    What is most interesting about this virus is that it claims that you viewed child pornography and thus violated a criminal code, which means that you should go to jail for 4 to 9 years. But if you send the FBI money they will helpfully not send you to jail.

    I hope no one sends this virus writer any money, as it only takes 1 person out of thousands to make this scam worth it. People need to be educated about how to use their information systems devices.

    This particular virus was a bit tough to clean, as even in safe mode the computer would just reboot quickly.
    But I was able to use my quick fingers and a method where you reboot the computer using CTRL-ALT-DEL and then cancel the reboot process.

    I then ran HijackThis, HitmanPro, and Malwarebytes to clean the virus, also reviewing the registry and file system for odd entries.

    It looks like this version called itself "Gaming Wonderland" and installed an infrastructure under C:\Program Files (x86)\Gaming Wonderland\


    Questions to ask a cloud company when looking for cloud services:

    1. Where are you located? (this may be important in regards to logistics)
    2. Where are the data centers located? (this may be important if you have customer data and may not want it to reside overseas)
    3. Can I specify where my data will reside? (same as above)
    4. What is the SLA or response time for problems and issues? (it is important to know the timeline of a support person looking into an issue)
    5. Are there costs for usage? Is it a flat monthly fee, or are there costs tied to bandwidth use? (this is important to know your complete future costs)
    6. What are the backup procedures and rules: 2 weeks retention? file by file recovery? (it is important to know what you are getting into)
    7. What are the scheduled maintenance windows? (good to know when the system is unavailable, and on which weekends)
    8. What security capabilities does remote access to the server have? (this is important, as 128-bit RDP encryption should be a minimum)


    The application everyone should have on the cloud: Backing up your files!
    By backing up your files on the cloud you automatically have solved the disaster recovery angle for your business.

    When (not if) your business office computers have problems then you are still operational by having your files on the cloud computers.

    If you have planned for this operational problem, it will not take as long to get back up and running versus if you have not planned for the problem.

    One of the companies on Talkincloud100 list: #40

    They are HIPAA compliant, which would be important if you need that compliance standard.


    OpenStack software, the scalable cloud operating system created by Rackspace and NASA. Rackspace was the #2 company in the TalkinCloud 100 list for 2012.

    The software is run as open source under the Apache 2.0 license (Apache web servers run over 60% of the web servers on the Internet).

    Who can use OpenStack software? Developers and companies creating software that will run on the Internet, like ecommerce applications, searching applications, or inventory apps. The problem with software is that one can create anything, but "anything" is hard to quantify. So one must understand the needs within a company and meet those needs with the correct technologies and software capabilities.


    Cloud computing migrations

    Led migration of a 9 computer network in Saint Louis to the cloud using DaaS and IaaS Cloud services.

    Cloud computer positives
    1. Business agility – add /remove or change resources faster than current IT infrastructure
    2. Cloud computing is an operational service expense, not a capital expense
    3. Automate your backups so that any person in company can recover files
    4. Disaster recovery solved with online backups that are automated by Cloud provider
    5. Use your desktop on old computers to fully realize their ROI
    6. Eliminate downtime – needs cloud provider strength, and multiple Internet connections
    7. Use computer and management resources for additional business possibilities
    8. Keep Security of your IT data in mind - passwords are more important than ever.

    Cloud computer negatives
    1. Under regulatory business models must use hybrid cloud models which cost more.
    2. All users must be aware of security issues


    How to measure speed to all the different cloud companies?

    One way is to measure the speed between your computer and a webserver of the cloud company.

    I have created a batch file that one can use to run a ping test to all of the companies. Download the ping batch file here.

    After downloading the file, open up a command window (type cmd and press Enter).

    Once you are in the correct directory (where you downloaded the batch file), enter the following command:
    pingall.bat > pingoutput.txt

    Now you can view the pingoutput.txt file, the output will be in the following manner:

    I removed all "Request timed out" entries, since a cloud company may stop pings (ICMP traffic) at its firewall.

    Notice that most pings came back with an average of 41-73 milliseconds; there were a few anomalous readings in the 137+ range, as well as some at 23-39. One does have to remember that a ping time is the round trip of a packet from your computer to theirs and back. This time can be affected by intervening networks and general "traffic" on the Internet. But the idea is to have some kind of data to evaluate these companies. It would be interesting to try these pings from a different provider (my Internet connection is with AT&T).
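If you want to tabulate the averages instead of reading them by eye, here is a small parser of my own (it is not part of the batch file) that pulls the "Average = Nms" value out of each Windows ping statistics line in a pingoutput.txt-style capture:

```python
# Extract per-host average round-trip times from Windows ping output.
import re

def average_latencies(text):
    """Return every 'Average = Nms' value found in the text, in ms."""
    return [int(ms) for ms in re.findall(r"Average = (\d+)ms", text)]

sample = (
    "Minimum = 28ms, Maximum = 63ms, Average = 37ms\n"
    "Minimum = 29ms, Maximum = 39ms, Average = 30ms\n"
)
averages = average_latencies(sample)      # [37, 30]
overall = sum(averages) / len(averages)   # 33.5
```

Running it over the full pingoutput.txt file yields the list of data points shown below, ready for sorting or averaging.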

    And I am in the Saint Louis area, so my proximity to the Network Access Points (NAPs), where all of the major ISPs are interconnected, affects these numbers.

    If you could email me your output file, and let me know where in the country you are from, that would be great.

    Output datapoints: Average in milliseconds:

    23, 30, 36, 37, 38, 39, 39, 41, 45, 47, 49, 49, 49, 50, 51, 55, 57, 57, 57, 59, 59, 60, 61, 61, 63, 68, 70, 71, 71, 73, 93, 137, 145, 148

    This is the output from the 10 pings to each company's webserver:

    C:\Users\tonyz>ping -n 10

    Pinging [] with 32 bytes of data:
    Reply from bytes=32 time=55ms TTL=54
    Reply from bytes=32 time=63ms TTL=54
    Reply from bytes=32 time=29ms TTL=54
    Reply from bytes=32 time=47ms TTL=54
    Reply from bytes=32 time=28ms TTL=54
    Reply from bytes=32 time=42ms TTL=54
    Reply from bytes=32 time=28ms TTL=54
    Reply from bytes=32 time=29ms TTL=54
    Reply from bytes=32 time=28ms TTL=54
    Reply from bytes=32 time=28ms TTL=54

    Ping statistics for Packets: Sent = 10, Received = 10, Lost = 0 (0% loss), Approximate round trip times in milli-seconds: Minimum = 28ms, Maximum = 63ms, Average = 37ms

    From here on out I removed each reply and just kept the average statistics.

    C:\Users\tonyz>ping -n 10 Pinging [] with 32 bytes of data:

    Ping statistics for Packets: Sent = 10, Received = 10, Lost = 0 (0% loss), Approximate round trip times in milli-seconds: Minimum = 29ms, Maximum = 39ms, Average = 30ms

    C:\Users\tonyz>ping -n 10

    Pinging [] with 32 bytes of data: Ping statistics for Packets: Sent = 10, Received = 10, Lost = 0 (0% loss), Approximate round trip times in milli-seconds: Minimum = 60ms, Maximum = 325ms, Average = 148ms

    C:\Users\tonyz>ping -n 10

    Pinging [] with 32 bytes of data: Ping statistics for Packets: Sent = 10, Received = 10, Lost = 0 (0% loss), Approximate round trip times in milli-seconds: Minimum = 61ms, Maximum = 120ms, Average = 70ms

    C:\Users\tonyz>ping -n 10

    All runs were `ping -n 10` from a Windows prompt; every run returned 10 packets sent, 10 received, 0 lost (0% loss). Approximate round-trip times in milliseconds:

    Run   Min   Max   Avg
     1     48    56    49
     2     63   126    71
     3     37    93    60
     4     63   127    73
     5     17    40    23
     6     66   102    71
     7     47   108    63
     8     81   134    93
     9     33    52    36
    10     49    92    59
    11     57    87    61
    12     43   105    57
    13    130   191   137
    14     48    83    59
    15     48    96    61
    16     39    85    57
    17     19   131    39
    18     38   100    55
    19     37    62    41
    20     32    91    47
    21    135   196   145
    22     61   108    68
    23     32    94    51
    24     34    86    49
    25     49    60    50
    26     33    88    45
    27     28    73    38
    28     37    59    39
    29     51   111    57


    The rest of the cloud companies with a short review (#33 through #100 from the Talkin Cloud 100). I have omitted almost all of the Office 365 cloud service companies, as well as the consulting service companies. What I am looking for are cloud companies that offer Infrastructure as a Service or Platform as a Service; it is interesting to also see some storage and backup vendors.

    100. Toglcloud DaaS.
    95. Inetinc DaaS, IaaS
    93. Bcgsystems SaaS
    92. Lightbound
    91. Uptimesystems DaaS
    87. Levelcloud Master Cloud Service Provider
    85. Paragrid backup , DaaS
    82. Etegrity storage and backup
    73. Projectorpsa
    72. Lanlogic
    71. Pickits
    67. Centrastage SaaS
    66. Netstandard
    59. Nephoscale
    58. Ntiva
    57. Allcovered
    55. Acumensolutions
    54. Ntirety
    53. Teklinks
    52. Zslinc
    51. Itauthorities SaaS - Exchange, Blackberry, Sharepoint
    47. 3Tsystems
    46. Nasstar hosted desktop in UK (DaaS)
    45. Apps4rent Microsoft SaaS
    44. Longviewsystems Canadian IaaS
    42. Connectwise
    41. Clarisnetworks 99.99% SLA
    40. remote-Backup backup
    35. Tier3 IaaS - PaaS
    34. Logicalis - IaaS
    33. Solvedirect Service Integration - pull multiple services together
    32. Excelmicro - SaaS - Symantec


    Always reviewing cloud companies... here are some:

    Some companies from the Talkin Cloud 100 list and their emphasis (I did not include some companies from #2-#22; #1 is Amazon, and there is no need to add them, as they are well known):

    22 Blulock
    21 Datapipe IaaS,
    20 IOmarthosting UK co
    17 telesphere pbx hosting
    14 Newrelic PaaS
    13 Iland IaaS
    11 Mindshift exchange vdi
    10 Apptix Apps SaaS
    9 Logicworks IaaS
    8 Layeredtech IaaS
    7 Softlayer IaaS
    6 Csiweb financial company emphasis
    5 Navisite Exchange
    3 Savvis IaaS
    2 Rackspace IaaS


    CenturyLink is, among other things, an MSP (Managed Service Provider): any cloud company that provides specific services, such as installing and managing servers in a cage in its own datacenter. See CenturyLink's Managed Hosting Services page:

    CenturyLink Managed Hosting Services

    This would include monitoring web servers on the Internet (like your e-commerce site) or monitoring the availability of servers for mobile applications. Anything that you want to monitor and set up on the Internet can be managed by an MSP.

    This includes environmental monitoring, like temperature and humidity control in a data center.

    Typically, in the past, an Internet Service Provider would monitor the routers it leased to you on your premises; now the same RBOCs (Regional Bell Operating Companies; CenturyLink is one) have turned this monitoring capability into a service.

    Of course you do know that CenturyLink owns Savvis, and thus I wonder if this is just a rebranding or a separate entity within the two-company infrastructure.

    After looking at the Savvis website, they are also branding Savvisdirect "which has cloud based solutions for your company"

    have to investigate this a bit more...


    When you have moved to the cloud, and even if you have not, the most important security defense is the safekeeping of your passwords.

    So in keeping with that, one must not write down any passwords. But you say, "I have to change my network password every 60 days! And what about my bank, my Facebook and Twitter accounts, and many other accounts?"

    Well, there is an answer to this dilemma, and it is even free to download and use, as it is open source: KeePass Password Safe. With KeePass you keep a small program on your desktop where you can save all of your passwords in one spot, and it is reasonably safe from attackers because one has to enter a single master password to run it and retrieve the passwords.

    You can see from the following picture (from the website?) that one can copy a password to the clipboard and then paste it into the password field when needed. So now one can use truly difficult, random passwords that are hard to crack.
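Since the manager remembers passwords for you, there is no reason not to generate long, random ones. Here is a minimal sketch in Python using the standard-library `secrets` module (KeePass has its own built-in generator; this just illustrates the idea):

```python
import secrets
import string

def make_password(length=20):
    """Build a password from cryptographically secure random choices."""
    alphabet = string.ascii_letters + string.digits + string.punctuation
    return "".join(secrets.choice(alphabet) for _ in range(length))

print(make_password())  # different every run
```

Twenty characters drawn from the full printable set is far beyond any dictionary attack, and since the manager pastes it for you, the length costs nothing.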


    Almost completed a cloud migration project to Uptime Systems. Due to some unanticipated hardware problems on the server (which was another reason to move to the cloud), it took a bit longer than planned.

    The actual downtime for the users was a bit higher than anticipated: 7 hours instead of 5.

    But now the client can access their files and time-management server from home as well as the office; there is no difference.


    Have you ever wondered how to measure the Internet server or system you are leasing, i.e., a cloud service provider analysis tool? One way is to ping the server in question and then collect the data from those pings: SmokePing Link. This will set up a long-term analysis of your server (service).

    Here is an example of three pings, to Yahoo, Google, and a third website, with 86ms, 28ms, and 47ms average responses respectively.

    In the simple ping example you can see more information; this is the kind of data the SmokePing service at the DSLReports website uses. DSLReports also happens to be a great resource for testing your Internet Service Provider's bandwidth claims.
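If you want to collect these numbers yourself without SmokePing, you can run `ping -n 10 <host>` from a script and parse the summary line that Windows prints. A minimal Python sketch; the regex assumes the English-locale Windows ping output format, and the sample text (with the documentation address 203.0.113.10) is made up for illustration:

```python
import re

# Matches the Windows ping summary line, e.g.
#   "Minimum = 48ms, Maximum = 56ms, Average = 49ms"
STATS_RE = re.compile(r"Minimum = (\d+)ms, Maximum = (\d+)ms, Average = (\d+)ms")

def parse_ping_stats(output):
    """Return (min_ms, max_ms, avg_ms) from ping output, or None if absent."""
    m = STATS_RE.search(output)
    return tuple(int(g) for g in m.groups()) if m else None

sample = ("Ping statistics for 203.0.113.10: Packets: Sent = 10, Received = 10, "
          "Lost = 0 (0% loss), Approximate round trip times in milli-seconds: "
          "Minimum = 48ms, Maximum = 56ms, Average = 49ms")
print(parse_ping_stats(sample))  # (48, 56, 49)
```

Run the parser on the stdout of `subprocess.run(["ping", "-n", "10", host], capture_output=True, text=True)` on a schedule and you have the raw material for a long-term latency graph.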


    What is the most important asset in your company? And in your personal life? What makes your work happen?

    And it is interesting to note that every person has the same amount of this 'stuff'; there is no such thing as Bill Gates having more of it than I do. Everyone has the same amount of time in the day (24 hours).

    It is what we do with our time that is most important and thus will shape us. My favorite Albert Einstein quote is: "Have the courage to take your thoughts seriously for they will shape you!"

    Why am I bringing up a philosophical argument? Because it is cloud related :).

    If you move your company to a cloud environment, with DaaS (Desktop as a Service) for example, now you will not have to think about how to upgrade a server or discuss with others about upgrading software every year or so.

    I challenge you to recall any thoughts from the last year about how your electric service works and whether you had to 'upgrade' it. Sure, you may think about the cost a bit, but there is not much analysis there; one may buy more efficient light bulbs, etc. Your IT systems will work the same way once you move to the cloud. Since the provider (CSP, Cloud Service Provider) will be the lifeblood of your IT and your company, it is important to pick the right one.


    Long View Systems has a great whitepapers section on many cloud concepts:

  • Private cloud
  • Deploying VMware and NetApp storage solutions
  • Cutting costs with Riverbed
  • EMC domain replicator
  • Consolidate your windows environment with NetApp and VMware
  • LAN/SAN consolidation with Cisco Nexus switches 5000
  • Windows 7 Desktop as a Service with Citrix XenDesktop

    These are all interesting cloud topics, especially the Cisco Nexus details, where you can drop the separate SAN network with the new Cisco Nexus devices. The unified fabric switch is good for 10 Gb/s with multipathing and more (a good white paper to read).


  • Now that I am looking for PCI (and other) compliance cloud companies, I am finding several... Here is another: LayeredTech, out of Texas.

    Another PCI, HIPAA, and FISMA compliant company: LayeredTech has many hosting services, including providing 80% of all PCI compliance functions.

    It looks like they started in 2004 as a managed service provider (MSP), then introduced some cloud hosting services in 2006. They are #8 in the Talkin Cloud 100 list (2012).

    Looks like another IaaS provider. LayeredTech seems to have multiple guarantees, including compliance-based managed services.


    I always wondered if a cloud company could be PCI, HIPAA, and/or SAS 70 compliant, and it looks like a company on the Talkin Cloud 100 (#9), Logicworks, is; see their tech spec offerings.

    The PCI compliance specs are most interesting:

  • Build and maintain secure client and admin networks
  • Strong access control
  • Monitor and test networks
  • Annual compliance audits
  • Quarterly security reviews

    The HIPAA compliance specs:

  • Identify & Authenticate
  • Access Control Lists
  • Accountable
    It also says that a dedicated system is placed in a DMZ (De-Militarized Zone) with managed Intrusion Detection Systems (IDS), log management, and daily penetration scans, for all patient data.

    We did all of those things at Savvis as well; Savvis has to keep all the clients separated, and by using separate firewall and IDS systems one can place servers at many different levels of security.


  • Found this interesting cloud offer:

    SoftLayer has a free cloud server for 1 month

    This way you can try their services on one server with one 2 GHz core, 1 GB of RAM, 25 GB of storage, 1 TB of bandwidth, and one IP address. It normally costs $50/month, but the first month is free.

    This company is what is called an IaaS, or Infrastructure as a Service, company, where at least 25,000 companies host some form of their IT services (according to Talkin Cloud).


    TalkinCloud's 20 Stock cloud index: TalkinCloud link

    It is an interesting list of companies, including Red Hat and VMware, which are cloud-building companies (one needs their software to build a cloud). So I am sure there are different categories within this list: IaaS and DaaS hosting companies as well as SaaS companies like Intuit with QuickBooks Online, etc.


    One of my consulting projects is to move a client (a law firm in Saint Louis area) to the cloud in the next few weeks...

    Specifically, to the cloud service provider Uptime Systems. Once we migrate all files, email, and firm management software (Time Matters, by LexisNexis), the attorneys will be able to access the files from anywhere on the Internet, or anywhere in the country where there is a case.

    Also important to me is that I will no longer have to perform backups, as Uptime Systems will take care of that automatically. On top of that, Uptime Systems uses RAID-5 (more expensive than the RAID-1 we have), so the servers will stay operational with less downtime.

    We are reviewing the cloud migration now, which consists of: email transfers, including the Outlook autocomplete data (*.nk2 files); the files on each desktop; and the database migration for the Time Matters application. The file server shares will stay the same. Favorites and desktop icons should also be set up or transferred.

    Currently we are using Rackspace's Jungle Disk cloud service, but it is not good enough, as Jungle Disk takes too long to download 8-20 MB files. On Uptime's systems the files will be on a local network, and each user will access their desktop with an RDP (Remote Desktop Protocol) session. An RDP session does not use much bandwidth and can be run from many different devices (including tablets, or even smartphones).

    This cloud service by Uptime Systems is called DaaS, or Desktop as a Service, since one accesses a desktop in the cloud (on a server somewhere on the Internet). Uptime Systems is ranked #91 by Talkin Cloud.

    We liked Uptime specifically because of their knowledge with Time Matters, and they seem to know what is going on with law firms in general.


    Cloud computing can solve your Disaster Recovery and Backup needs if set up right:

    Another great reason for going to the cloud is to have backup and automatic Disaster Recovery services.

    If one's systems are in the cloud (a server in a data center on the Internet), then one can access the server from anywhere with an Internet connection, which is possible in many locations, not just near where you live.

    But see Cringely's blog (which he worked on until he retired, I guess), where he discusses a company that housed other people's data and went down because of Hurricane Sandy and an unexpected basement flood around November 2012.

    Cringely brings up a good point: to truly have DR and IT services in the 'cloud', one should have multiple sites housing your data, even if it costs more, as it is more resilient in an emergency. At least understand the risks of your various vendors; e.g., a vendor in Texas may have hurricane risk, whereas a vendor in California may have earthquake risk. So not only is it important to have an outstanding Internet pipeline and the ability to connect to the vendor; it is also important to choose a vendor with the least natural disaster risk.


    Cloud migration post from CloudTweaks blog

    The blog lays out a few steps for a cloud migration project, which I have added to and modified:

    Remember to make a financial assessment of current costs and future costs, while also keeping in mind the future flexibility and new abilities of the cloud services.

    1. Prepare and map out the upcoming transformation. The emphasis here should be on the people included, who will help make the IT decisions - a kind of IT migration committee.

    2. Business and Application Assessment: the goal of placing as many apps as possible in the cloud needs to be assessed here. Keeping an open mind within the IT migration committee is most important.

    3. Vendor Selection: keep the selection to a manageable two or three vendors.

    * Request for Information (RFI): decide on the primary vendor criteria that are important to the business (the features one cannot live without). The direction should come from step 2.
    * Gather responses and review, including setting up live demos.
    * Send an RFQ (Request for Quote) and select a vendor after contacting the vendor's current customers.

    4. Risk and Liability Mitigation: it is important to review the full risks of the migration, including having a backup plan.

    5. Day-to-Day Production: review status at mid-year or full-year milestones; creating audits of the service is important.


    Here is an example of a cloud application:

    Adobe FormsCentral is a cloud Adobe CreatePDF application

    The old apps have to be installed on your desktop and cost about $300 per computer.

    The cloud apps are accessed through an app installed on your device and cost $15/month, or $144/year ($12/month billed annually). So you say: in about two years I will have paid what the 'standard' application would have cost; except now you can use the app on several devices, and 5-seat prices are also billed annually at $12/month. Team prices link

    The advantage of a cloud app is:

    1. You can use it on more than one device.
    2. The app is automatically upgraded, and upgrades are included in the cost.
    3. The cost is reasonable when you take into account the multiple devices it can be used on, instead of having to buy multiple copies of Adobe Acrobat Creator (or its latest incarnation, XI Pro).
    4. Little or no cost paying IT service companies to install or update your applications.
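The "two years" claim is just the break-even point of the subscription against the one-time license. A quick check using the prices quoted above (ignoring seats and upgrade cycles):

```python
# Break-even of a $300 one-time desktop license vs. the $144/year
# annual-billing subscription price quoted in the post.
desktop_license = 300.00
cloud_per_year = 144.00

breakeven_years = desktop_license / cloud_per_year
print(round(breakeven_years, 2))  # 2.08
```

So after roughly two years the subscription has cost as much as the boxed copy, which is why the multi-device and auto-upgrade benefits carry the argument.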


    2012 Top 100 Cloud Service Providers list is called the Talkin Cloud 100: talkincloud 100 link

    This comes from a survey that is run every year; add your name to it if you are a CSP (Cloud Service Provider).

    #1 Amazon Web Services
    #2 Rackspace
    #3 Savvis

    #91 Uptimesystems (a DaaS among other capabilities).


    Racemi cloud migration link

    This is an interesting company that lets you migrate servers quickly, as it maps the current server's file drives to a cloud server's drives.

    There always has to be a good testing and security audit of the new environment before going online with the new cloud system.

    The cloud hosting companies they support are Rackspace, Amazon Web Services, IBM, GoGrid, and Terremark.


    RedHat's Cloud Computing Info

    Cloud virtualization with the Red Hat operating system can be hosted on the Internet under a dedicated-server hosting contract, with many different mixes of dedicated and virtual servers. This could also be called a hybrid solution. If regulations require you to control the data, a hybrid solution may better fit your needs.


    Some good Cloud Blogs I have been reading lately:


    A good article on Desktop as a Service (a new trend).


    An article about the consequences of PaaS provider lock-in; open PaaS makes more sense.

    So if you are working on a PaaS project, consider provider lock-in issues.

    Thinking Out Cloud

    Discussing an open source cloud survey of more than 600 IT professionals.



    PortalGuard is an SSO, or Single Sign-On, services company. This means that with PortalGuard you can sign on once (even with two-factor authentication, i.e., more securely) to Google Docs/Mail, Outlook webmail, and Salesforce, as in this Youtube video link

    Of course you can sign on to more than three web services; it is all a matter of configuration.


    New Cloud Security webpage

    A Fixvirus webpage about Cloud Security Weblinks...


    Yes, I passed the CloudU certificate exam, with a 94%...

    It was an interesting and informative body of material, about 75% of which I already knew thanks to my 16 years in information technology.

    Here is a link about the Rackspace CloudU certificate program Rackspace link


    Cloud computing Pros and Cons

    Pros                                      Cons
    ----                                      ----
    Reduced costs                             Compliance or regulations require on-site data
    Resource sharing                          Security and privacy
    Cost based on usage                       SLAs must be very good
    New services take less time to build      Interoperability and portability
    More resources available when busy        Availability and reliability


    A great Cloud Security document which has to be assimilated in today's ever changing IT environment: CSA(Cloud Security Alliance) Guidance: PDF of CSA Guide2.1

    Cloud computing security is discussed there, so if you are thinking of moving to the cloud, ask the following questions for each asset:
    1. How would we be harmed if the asset became widely public and widely distributed?
    2. How would we be harmed if an employee of our cloud provider accessed the asset?
    3. How would we be harmed if the process or function were manipulated by an outsider?
    4. How would we be harmed if the process or function failed to provide expected results?
    5. How would we be harmed if the information/data were unexpectedly changed?
    6. How would we be harmed if the asset were unavailable for a period of time?

    These are vital questions to ask if thinking of going to cloud with some or all of your IT systems or applications. In some regulatory environments it may be best to keep things in house.
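One lightweight way to use the six CSA questions is to score each asset against each question and rank the totals. A hypothetical sketch (the asset names and the 0-3 harm scores below are made up purely for illustration, not taken from the CSA guide):

```python
# Score each asset 0 (no harm) to 3 (severe harm) against the six CSA
# questions above; assets and scores here are hypothetical examples.
assets = {
    "customer database": [3, 3, 3, 2, 3, 2],
    "marketing site":    [0, 1, 2, 1, 2, 2],
    "internal wiki":     [2, 2, 1, 1, 1, 1],
}

# Rank assets by total potential harm; the highest scorers are the
# candidates to keep in-house in a regulated environment.
ranked = sorted(assets.items(), key=lambda kv: sum(kv[1]), reverse=True)
for name, scores in ranked:
    print(f"{name}: {sum(scores)}")
```

A spreadsheet does the same job; the point is simply to make the "how would we be harmed" answers comparable across assets before choosing what moves to the cloud.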


    Securing data on the cloud. Keep a few items in mind:

    If your data now resides in the cloud so that you can access it from multiple devices, don't neglect security.

    Here is an interesting snippet from the article: "Gartner predicts that worldwide consumer digital storage needs will grow from 329 exabytes last year to 4.1 zettabytes in 2016".

    Obviously there is a trend here: with the iPhone 5 coming out and ever-quicker updates to mobile devices, the only certainty is that change will occur more frequently.

    I am keeping some of these old posts as anchors, as it is always good to know the Innovator's Dilemma and a few older "future predictions" from the past.

    Innovator's Dilemma graph:

    Notice that the disruptive technology starts out at a lower market capability than the established line, but it too will rise.

    The key is to find a market at the lower levels with the future in mind.
    Of course it is easier said than done, and depends on your budget, but the rewards can be immense.

    See the recent post below for more information on the book The Innovator's Dilemma.

    Woman finds error in TurboTax
    From a story on NBC: OMAHA, NE -- A woman recently discovered a shocking flaw in a website thousands of people use to prepare their taxes.
    Instead of taking advantage of this potential gold mine for identity thieves, she is calling attention to it to protect other taxpayers.
    Very interesting - who knows how many people were hacked by this flaw? (TonyZ 4/12/2007)

    Webpage posts from 2011-2012

    2009 and 2010 posts at the following page: 2009 - 2010

    Old webpage - as it looked 05/09/2007

    Symantec Hoax list: Do not propagate the hoax 'warnings'. When receiving an email to send to 10 friends check this list first.

    McAfee virus list - includes virus map

    An excellent explanation of the benefits of Open Source: benefits page. Just checked it out, and it is still a very good reference on how open source programming works ;). 03/2/2006 TonyZ

    Control/politics review: Open Source technologies allow businesses to take control of their destiny. You are not dependent on the vendor getting features installed that are important to your business.

    The Business case for Open Source

    The SANS Institute's CID (Consensus Intrusion Database) is an excellent project that looks at many security-related issues. CID gives a snapshot of current hacker/worm activity on the net.

    Anti-Virus centers that we use to control viruses.
    McAfee Virus Information Library
    This database contains information on more than 50,000 known viruses, including how they work and how to kill them.
    Symantec AntiVirus Research Center
    Symantec's security update web page - extensive resources and latest information on viruses on the Internet.

    Washington Post Chronological History of the Computer Virus