Ain’t technology wonderful?
How many of us in the current workforce remember a time before IT systems automated so much of our workflow? There was a time when most professional jobs required many hours of tedious manual data manipulation, manual calculation, and plenty of deductive reasoning to produce anything meaningful. Now, when IT is appropriately applied, tasks previously completed slowly and manually are finished almost automatically, in a fraction of the time they used to take. We can accomplish so much more than before the advent of modern technological solutions.
WHAT? What’s that you say? Only when it is working correctly? Are you telling me that your IT systems are not always up and running 100%? What happens when systems go down or when interactive software does not synchronize correctly? Those of us who remember the dinosaur days may long, at these times, for the days of old, when our productivity did not depend on IT. But of course, we realize that we do not really want to return to the days when information was so difficult to obtain and collate. We just want our IT to be reliable and free of downtime. We just want systems to work smoothly. Is this too much to hope for?
Well, if you have given up and accepted that poor IT reliability will always be a source of frustration, you may be pleased to learn that reliable systems DO exist! They can be designed and deployed to ensure minimal downtime, enabling peak operating efficiency and a greater focus on the business. This should be expected when IT systems are strategically aligned with business goals and objectives, and the different IT components are designed to work in unison. Imagine a world like that!
Below is a list of some of the commonly occurring destroyers of productivity that are caused by poorly designed and maintained IT systems. The good news is that for each one of these problems (and many more), there are solutions available that can prevent these problems from occurring in the future.
Outdated Software
To remain competitive in the marketplace, IT tools (software) must be kept up to date, with the key features that support competitive performance. Outdated software can also force users into inefficient “workarounds” to avoid system lock-ups and other problems. Upgrades can be costly and can result in downtime or a temporary dip in performance during re-training.
To offset this risk, a strategic approach to IT system development (i.e. software selection, configuration, and utility) is needed, and it is best deployed during the development of the overall business strategy. Such an approach requires input from IT industry experts who stay abreast of technology trends.
Loss of Data
Data loss can occur for a variety of reasons and with wide-ranging impact. Individual files can be lost through human error or misuse of local storage. A much larger loss can occur when a local hard drive fails and no adequate real-time backup exists. Weak data security combined with human error (or sabotage) can result in massive data loss.
Fortunately, the risk of losing data can be minimized. Controls can be established to limit the likelihood of loss events, and mitigation plans prepared to prevent escalation if a data loss event does occur.
Slow Internet Connections
Slow connections can result from problems with communications hardware such as a switch, firewall, or router. The problem could be with your ISP, or you could have too many bandwidth-hogging applications open. The causes of slow internet connections are many. A good IT MSP can run a systematic check of your systems to isolate the reason(s) for the slow connection, then quickly repair the problem, or recommend an update to equipment, provider, or software if the problem turns out to be systemic. The IT MSP can also train your staff to spot and correct some of the more basic, simple-to-resolve problems.
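As a rough illustration of that kind of systematic check, the sketch below times TCP connections to a few checkpoints to help tell a local network problem from an upstream one. The host addresses shown in the comments are assumptions for illustration only; substitute your own gateway and test hosts.

```python
import socket
import time

def measure_latency(host: str, port: int = 443, timeout: float = 3.0):
    """Return the TCP connect time to host:port in milliseconds, or None if unreachable."""
    start = time.perf_counter()
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return (time.perf_counter() - start) * 1000.0
    except OSError:
        return None

# Example usage (hosts are illustrative assumptions, not recommendations):
#   measure_latency("192.168.1.1", 80)   # your local gateway
#   measure_latency("8.8.8.8", 53)       # a public DNS server
#   measure_latency("example.com", 443)  # a public website
# If the gateway responds quickly but everything beyond it is slow or
# unreachable, the bottleneck is likely upstream (ISP) rather than local.
```

The idea is simply to test each hop in order, which is what a systematic diagnosis does before replacing equipment or blaming the provider.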
Poor Problem Resolution
When problems arise that cause system downtime or simply inefficient workflow, a prompt response is necessary to keep the business focused on its objectives. Poor or slow problem resolution can arise for a variety of reasons. Help desk support can range from stellar to ‘don’t bother’. When a problem arises, is the root cause identified and a permanent fix applied?
A proactive approach is best. With the identification and monitoring of relevant performance metrics, many problems can be avoided before any serious issues arise. Are you receiving this level of support from your IT service provider?
Redundant Data Entry
Redundant data entry causes many potential problems, and the need for it has varied causes. When systems are not well integrated, you may be required to enter the same data manually in multiple storage locations. Besides the time wasted re-entering data, redundant manual entry increases the opportunity for inconsistency between the different data sets. Ideally, the data entry systems would be integrated so that data is entered only once into a single repository, accessed by all of the software that uses it. Secondarily, some level of control over data entry is necessary to ensure data quality.
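As a minimal sketch of the single-repository idea, assuming a hypothetical shared customers table: data is entered once, with basic entry controls, and every application reads the same record instead of re-keying its own copy.

```python
import sqlite3

def open_store() -> sqlite3.Connection:
    """One shared repository (an in-memory SQLite database for illustration)."""
    conn = sqlite3.connect(":memory:")
    conn.execute(
        """CREATE TABLE customers (
               id    INTEGER PRIMARY KEY,
               email TEXT NOT NULL UNIQUE,  -- entry control: no duplicate records
               name  TEXT NOT NULL
           )"""
    )
    return conn

def add_customer(conn: sqlite3.Connection, email: str, name: str) -> int:
    """Enter the data once, with basic validation before it is stored."""
    if "@" not in email or not name.strip():
        raise ValueError("invalid customer data")
    cur = conn.execute(
        "INSERT INTO customers (email, name) VALUES (?, ?)", (email, name)
    )
    conn.commit()
    return cur.lastrowid

# Billing and shipping software would both read this one record, rather than
# each keeping (and re-keying) its own copy:
#   row = conn.execute("SELECT name FROM customers WHERE email = ?",
#                      ("pat@example.com",)).fetchone()
```

The UNIQUE constraint and the validation check are the “control over data entry” mentioned above: bad or duplicate data is rejected at the single point of entry instead of drifting between copies.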
Incompatible Software
As discussed above, incompatible software can lead to redundant data entry, and ultimately to the dreaded ‘GIGO’ (garbage in, garbage out) syndrome. Beyond redundant data entry, incompatible software can cause a significant loss of productivity. When the outputs of different software packages are not easily related to one another, correlated trends may be obscured or easily misinterpreted.
Given the widespread use of web-based applications, compatibility also depends on which browsers the software runs on correctly. The same can be said of operating systems. For example, software written for Apple’s iOS generally will not run on Android, and vice versa. There may also be situations where software needs to be compatible across different networks, which presents a different set of challenges.
Despite these compatibility risks, workable solutions can be found, provided you address IT issues strategically, with the entire system considered during the design phase.
The Cost of Downtime
The direct cost of downtime is obvious, considering time-related costs such as labor or professional services. Less apparent are the indirect costs, such as lost opportunities. If time is spent inefficiently, the workforce has less time to apply to revenue-generating activities, which is often reflected in reduced sales volumes. Employee engagement can also suffer when downtime events occur, leaving employees under-motivated and frustrated when their well-intentioned efforts are sabotaged by unreliable systems.
Reputation with your customer base can also suffer when deliveries are delayed due to system downtime.
Software Output Does Not Match Business Needs
Often a company changes strategy to accommodate shifting markets, while its IT systems do not keep pace. During these periods of change, employees are forced to make do with the resources in place. Individual changes that occur in small steps can add up to huge changes over time. A direct comparison of software outputs against business needs can reveal that the IT systems in place are no longer compatible with the business strategy or operational demands. These effects are magnified when the market is changing and the competition has made the investments to ensure their IT solutions keep pace. While the cost of upgrading may seem daunting, the cost of continuing to use software unfit for your purpose is greater. Upgrade costs can be controlled and minimized through the use of a competent IT MSP.
Inadequate Disaster Recovery
Contrary to popular belief, natural disasters (e.g. flood, hurricane, etc.) are not the primary cause of IT disaster events; the majority are caused by human error. Unless your staff and other support operate mistake-free, you are vulnerable to data loss. Consider, too, that a massive loss of data, or of access to data and information, can be catastrophic to a business within a very short time frame. According to Small Business Trends, 60% of SMBs that lose their data shut down within 6 months.
Many believe that disaster recovery solutions are an expensive luxury, so they forgo the expense of installing and maintaining them and leave their fate to chance. This thinking carries a risk of potentially catastrophic consequences. An adequate disaster recovery system costs less to install and maintain than it used to, thanks to the development of virtualization and replication solutions, most of which include a cloud-based component. If you do not maintain adequate disaster recovery capabilities, you owe it to yourself to assess the benefits and cost of implementing a DR system before continuing to operate without protection.
Malware
Malware poses a significant threat to any business with an internet connection to the outside world; no business is immune. Malware attacks computers in a variety of forms, including:
- Viruses – self-replicating and often difficult to remove
- Trojans – do not self-replicate, but can enter your computers in sophisticated, hard-to-detect ways
- Worms – self-replicate and also spread to other computers, for example by accessing contact lists
- Spyware – collects sensitive information from your computer (e.g. credit card details)
- Adware – displays unwanted/unsolicited ads and can completely lock up a computer’s operation
- Ransomware – encrypts the data on your device (computer, smartphone, etc.), holding your data and information hostage until the victim makes a payment of some sort to the attacker
No single software package can detect and prevent 100% of the malware in existence. New malware is generated continuously, so protection software must be continuously updated as new threats are identified. Because of the inevitable gap between the identification of new malware and the corresponding protection update, no solution is 100% risk-free. This heightens the need to maintain a good data backup and recovery system, one that is independent of the primary data storage.
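One way to keep such a backup honest, sketched below under the assumption of a simple file-copy backup: compare SHA-256 checksums of the primary and backup copies, so that silent corruption, or a ransomware-encrypted primary, is detected rather than faithfully mirrored.

```python
import hashlib
from pathlib import Path

def file_digest(path: Path) -> str:
    """SHA-256 of a file, read in chunks so large files don't exhaust memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def verify_backup(primary: Path, backup: Path):
    """Return relative paths of primary files that are missing from, or differ in, the backup."""
    problems = []
    for src in primary.rglob("*"):
        if not src.is_file():
            continue
        rel = src.relative_to(primary)
        dst = backup / rel
        if not dst.is_file() or file_digest(src) != file_digest(dst):
            problems.append(str(rel))
    return problems
```

Running a check like this from a machine other than the primary file server keeps the verification itself independent of the storage it is meant to protect.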