Evaluating and ultimately selecting a technology solution as materially transformative as an ERP package requires an organization to engage in some in-depth self-reflection. It must reflect on how it operates today, considering opportunities to re-engineer, optimize, and standardize current business processes. It must also reflect on the future: Where is the business going? What are external market and regulatory pressures mandating? How will we drive competitive differentiation and growth, enhance flexibility and scalability, and deliver operational excellence across the business? What new, innovative technology solutions should we consider? The cultural implications of a change this large are a significant consideration, as are the organization's tolerance for risk and approach to technology adoption. Enterprise architecture forms the basis for delivering business capability, ensuring secure, flexible, robust platforms from which business services can be delivered. Opportunities abound in this space, but they require careful evaluation and consideration to ensure they will deliver on business expectations and requirements.
WGroup was retained by a Fortune 500 national distributor in the oil and gas industry to perform an ERP evaluation and selection. The driver to assess a new technology solution was a strategic, multi-year plan to grow revenue and share value. However, the company's aging technology was unable to scale and accommodate the significant business model transformation required by the plan. WGroup's mandate was to fully understand the nuances of the new business model and strategic direction and obtain deep insight into the current technology environment. The scope encompassed:
Assessing core integration points and, especially, the limitations of the existing architecture
Helping design and review current operational process maps to identify risks and opportunities for optimization and re-engineering
Ensuring that integration of the ERP solution with new fleet-tracking and inventory systems was maintained as a core requirement
Ensuring the ability to obtain, retain, and utilize as-yet-untapped data sources to drive actionable predictive analytics
Internal culture and the design of the operational business structure also had to be evaluated as part of the selection criteria.
Our work enabled the corporation to make a data-driven, fact-based decision on the optimal architecture and solution to deliver the outcomes, goals, and objectives the business required.
Learn more about ERP considerations in the following articles:
App Rationalization – Strategies for a leaner, more effective app portfolio
In our last post, we talked about what app rationalization is and how it can help your business. In this article, we're going to discuss some specific techniques for app rationalization and why WGroup believes they are the most effective ways to make your app portfolio more cost-effective and efficient.
What are the goals of app rationalization?
App rationalization is the process of assessing IT applications across the organization to identify those that should be eliminated, consolidated, optimized, or replaced. But in order to effectively streamline an app portfolio, it is important to understand what effective app rationalization looks like. App rationalization should ultimately lead to a more modern, flexible, and cost-effective software platform that can reduce overhead and leave room for growth.
In order to effectively embark on an app rationalization initiative, it is important to have a well-thought-out roadmap. At WGroup, we have identified several key strategies for effectively optimizing an application portfolio. Below are some of the most important of those strategies.
Identify low-hanging fruit – Optimizing a large app portfolio can be overwhelming. It's best to start with the low-hanging fruit: look for items that are clearly unused and would be easy to remove from the portfolio. This allows the initiative to gain a footing and quickly realize benefits, encouraging further investment of time and resources in the effort.
Communicate with leadership – IT and business leadership have to understand what the app rationalization initiative is, why long-used apps are being deprecated, and why they should invest in further efforts to optimize the portfolio. By communicating effectively, demonstrating value, and driving business goals, the initiative is more likely to be successful.
Locate consolidation opportunities – It is all too common for divisions within a company to use multiple applications to perform the same task. These redundancies waste financial resources and overcomplicate support efforts. Eliminating the redundant apps can be an early way of optimizing the app portfolio.
Replace obsolete apps – For applications that are business critical but obsolete, it is important to identify newer, more effective alternatives for replacement. This ensures that there are no gaps in service and allows the business to realize immediate gains in efficiency.
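As a rough illustration of how these strategies translate into practice, an application inventory can be bucketed with a simple rules pass. The sketch below is hypothetical Python; the field names, thresholds, and rules are our illustrative assumptions, not a formal WGroup framework:

```python
from dataclasses import dataclass

@dataclass
class App:
    name: str
    monthly_active_users: int
    annual_cost: float        # tracked so retired apps can be rolled into a savings estimate
    capability: str           # the business function the app serves
    business_critical: bool

def recommend(apps):
    """Bucket each app as retire / consolidate / retain.

    Rules (illustrative): group apps by capability to expose redundancy,
    keep the most-used app per capability, retire unused non-critical apps
    (the low-hanging fruit), and consolidate the rest into the keeper.
    """
    by_capability = {}
    for app in apps:
        by_capability.setdefault(app.capability, []).append(app)

    actions = {}
    for capability, group in by_capability.items():
        group.sort(key=lambda a: a.monthly_active_users, reverse=True)
        keeper, *rest = group
        actions[keeper.name] = "retain" if keeper.monthly_active_users > 0 else "retire"
        for app in rest:
            if app.monthly_active_users == 0 and not app.business_critical:
                actions[app.name] = "retire"       # clearly unused: easy early win
            else:
                actions[app.name] = "consolidate"  # fold into the keeper
    return actions
```

A real framework would weigh many more dimensions (contract terms, technical debt, data migration cost), but the core move is the same: inventory, group, and decide per app.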
The WGroup approach
At WGroup, we have put together a framework that helps companies manage their application portfolios and transform their IT environments. This allows companies to implement these decisions using the appropriate tools and technologies and to partner with the right vendors. We work with companies to stand up relevant processes so they can use the framework to make and implement these decisions on an ongoing basis. In the past, this has allowed clients to reduce their application portfolios by 10-20%, transform legacy applications, and make future decisions on tools, processes, and technologies. We have seen these decisions transform predominantly legacy shops into modern IT shops that are agile, cost-effective, and efficient.
The average company has over 500 apps, yet uses just over half that number on a daily basis. Over time, organizations add applications and infrastructure without engaging in efforts to reduce their existing inventory. Inevitably, this leads to unused or underutilized apps that waste resources and decrease efficiency. Maintenance for these unused applications drives ever-increasing costs over time and can have a dramatic effect on overall IT cost trends. Maintenance costs constrain IT budgets and crowd out new projects and innovation that could add value to the company.
This process also leads to IT accumulating a greater number of legacy applications until it ultimately becomes predominantly legacy. Once a company reaches that point, effectively managing the IT budget and evolving the organization becomes increasingly difficult. To prevent this from happening, companies must have the discipline to review their application portfolio regularly and remove outdated, obsolete, or redundant items.
What is app rationalization?
App rationalization is a critical part of taking back control of your application portfolio. It is the process of assessing IT applications across the organization to identify those that should be eliminated, consolidated, or replaced. By creating a comprehensive list of IT applications and evaluating their role within the organization, companies have a roadmap for transformation to a more efficient, lean IT enterprise. Although app rationalization is only a first step to optimizing your application portfolio, it forms the foundation for a highly scalable and dynamic IT organization.
How can app rationalization help your business?
App rationalization allows companies to deliver better service at reduced cost. It is important for IT to actively manage its applications like a portfolio. This portfolio needs to be tracked and adjusted each year, and IT leaders must make critical decisions to invest in, consolidate, retire, or replace certain applications.
Reduced costs – It is estimated that 10% of all application spending goes to applications that are no longer used within the organization. Retiring applications leads to the retirement of servers, storage, databases, and other costly pieces of infrastructure. The manpower needed to maintain these applications is also reduced. This can ultimately result in significantly lower IT operations costs.
Improved service – A bloated app portfolio makes it difficult for organizations to deliver the best possible service. When resources are being wasted on unnecessary applications, more critical components inevitably suffer. App rationalization allows you to focus your resources where they’re most needed.
Room for growth – Budgets constrained by outdated applications won’t have as much room to allow your company to invest in new technologies. By pruning the application portfolio, IT leaders can free up resources to grow.
Mergers & Acquisitions – App rationalization can be a critical component of a successful merger. As two companies attempt to reconcile their varied app portfolios, there will inevitably be redundancies or pieces that don’t make sense for the new organization’s goals. By assessing both companies’ app portfolios, IT leaders can ensure they chart a unified course.
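To put the "reduced costs" benefit in concrete terms, a back-of-the-envelope model shows how the commonly cited 10% figure translates into dollars. The budget figure and parameters below are illustrative assumptions only:

```python
def rationalization_savings(annual_app_spend, unused_share=0.10, recovered_fraction=0.8):
    """Estimate annual savings from retiring unused applications.

    unused_share: the ~10% of application spend estimated to go to unused apps.
    recovered_fraction: how much of that waste a rationalization effort actually
    eliminates. Both defaults are illustrative assumptions, not benchmarks.
    """
    return annual_app_spend * unused_share * recovered_fraction

# A $50M annual application budget, 10% wasted, 80% of the waste recovered:
savings = rationalization_savings(50_000_000)  # roughly $4M per year
```

Even under conservative assumptions, the recovered spend is material, which is why retiring unused apps is usually the first lever pulled.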
App rationalization is a critical component of IT efficiency. As these decisions are made, they will have wide reaching effects on IT operations and infrastructure. It is crucial that IT leaders understand its importance in order to build a culture that makes app rationalization a priority.
Given the immense importance of the data ecosystem, many CIOs may wonder what their role will be in ensuring that the company is positioned for data success. IT leaders need to assess their people, processes, and technology and provide the leadership that underpins these contemporary data ecosystems. This means having a clear understanding of both business goals and the technology that can help drive them. The CIO's role is one of IT leadership and business advisement to ensure that the company uses the data ecosystem effectively.
Connect with business leaders
One of the most important roles for the CIO is serving as a connection between business leaders and the technology world. A company cannot effectively use the data ecosystem unless it has strong buy-in from business leaders. This means that the CIO must strive to show the real value of data and data-driven processes and tools. Building a coalition of partners in business and IT units is critical to ensuring that every facet of the company is using data to drive insights and innovation. The CIO must work with business leaders to motivate collaboration at all levels.
Build relationships – IT’s role is one of business support. It works to ensure that the business is using data in a way that allows employees to work more effectively and innovate. This means that the CIO must constantly build relationships both within and outside of IT. The data ecosystem should be a part of every business unit and every decision made within the company. The CIO needs to listen to the needs of the business and collaborate with other units to implement solutions that work for everyone.
Make the business case for data – Developing the infrastructure necessary for companies to fully embrace the data ecosystem means significant time and resource investments. Many business leaders will be hesitant to make significant outlays without a strong business case. It’s the responsibility of the CIO to make this case and work with business leaders to develop solutions that meet the needs of the company.
Invest in the data ecosystem
The CIO must ensure that the company invests the time, resources, and cultural change needed to become data-driven. The forward-thinking CIO needs to invest in IT skills and technology partners that will foster a culture that is motivated to understand the business data at deeper levels and that will be able to collaborate with the business at a data-context level. IT must play a major leadership role in enabling the necessary frameworks, architectures, and governance of the data ecosystem. The CIO needs to harness core competencies in managing data-ecosystem services that consume both structured and unstructured data, providing analytics "sandboxes" that allow for exploration, hypothesis modeling, and prototyping. These new structures require agile technologies and methodologies that don't demand "perfect" quality scrubbed data.
Shadow IT is becoming increasingly common as workers go outside of the CIO's purview to implement solutions that meet their needs. This can cause problems for the IT department, which must often fix technical issues and security breaches introduced by these solutions. However, the CIO cannot afford to simply pretend these outside needs do not exist. Knowledge workers are demanding self-service tools that let them stand up data environments quickly, without long lead times and without depending on the IT organization. IT should focus on building self-service frameworks that liberate the knowledge worker, providing more independence for experimenting, data exploration, and modeling, but in a way that works with the company's overarching technology goals.
Ensure data readiness
Today’s data-driven organizations need secure, clean, and in-context data. These are high hurdles for most IT organizations, due to a lack of data centralization and the challenges around data integration when connecting disparate structured and unstructured data sources. Implementing master-data management and other similar solutions can help organize, centralize, and clean data, ensuring greater accuracy and consistency across the business. The CIO must spearhead these initiatives, working with business leaders to collect and collate data, reducing duplicate records, and improving the overall cohesiveness of the company’s data.
Increase compatibility and connectivity
Collaboration across the enterprise is a critical element of the data-driven workplace. Technology tools and flexible infrastructure, such as cloud and mobile, have emerged and are becoming more commonplace. This allows for the connection of these complex data ecosystems to enable more natural data exploration in serving the dynamic, interdependent needs of organizations. However, it is the responsibility of the CIO to ensure that these tools are adopted and that data is cross-compatible between platforms. Ensuring that data is clean and consistently formatted requires significant oversight and governance. The IT department must help guide the business to ensure an effective, overarching strategy for data across the enterprise.
This article is an excerpt from the outstanding white paper entitled The Data Ecosystem – Becoming a Data Driven Enterprise. Click here to request the complete white paper. Readers of The Data Ecosystem – Becoming a Data Driven Enterprise learn:
Four key ingredients to develop a data strategy
How to be more business focused with IT-driven data
Three essential steps for rolling out implementation
When most consultants evaluate a client’s IT operations, they rely on benchmarks to provide a cost and performance baseline, set goals, and measure progress. But there’s a problem with this approach: It simply doesn’t work. Cost benchmarks force client data into a generic model that isn’t able to capture the unique differences in client service strategies and can’t account for service quality, performance levels, or consumption issues. These limitations ultimately lead to assessments based on invalid data that don’t help the client company meet its objectives.
Benchmarking is broken
Benchmarking suffers from several critical limitations that make it an inadequate tool for assessing a company’s IT services and measuring progress.
Generic models – Perhaps the greatest issue with benchmarking is that it relies on standardized models that aren’t fit to the unique characteristics of each organization. This means that they can’t account for differing client goals and strategies. For example, they may not accurately reflect the equally viable strategies of focusing on lowest-cost services versus managing IT as a strategic investment. Forcing data into a standardized service and cost model doesn’t align with how most companies view their IT services, which means that the benchmark results don’t mean very much.
No accounting for financial parameters – Companies have a wide range of financial options when building out their IT applications, infrastructure, and services. They must decide whether to lease or buy, whether to capitalize or expense, when to time acquisitions, and how to manage purchase volumes. Unfortunately, many benchmarks don't take these parameters into account.
Out of date data – If a benchmark relies on data that is more than six months old, it may have limited validity today. In order to be effective, benchmarks must be based on fresh data that accounts for recent changes.
Not fit to client profile – Every client will have different requirements and amounts of leverage. Most benchmarks don’t provide a practical assessment of what is available within the industry as it relates to the client’s industry position and financial considerations.
A better approach
At WGroup, we believe that traditional benchmarks are problematic and have no place in IT consultations. Our approach differs in that we use data from our engagement experiences in combination with more conventional benchmark information to create a comparison between the client’s cost structure and those of our other clients. The relative subjectivity of comparing client services with those of other organizations requires us to provide as many details as possible to compare and contrast the IT services including the scope of services performed, service delivery models, service level attainment, and, if applicable, contractual terms. This provides a more comprehensive, up to date assessment that’s fit to the client’s unique needs.
A complete and uniform understanding of the client’s current performance, limitations, and challenges provides a better foundation for future planning. This allows for a more natural progression to sourcing strategy development, scope and timing of RFPs and other critical strategic decisions. With a full understanding of current capabilities, risks, constraints, and goals, it is possible to create a better roadmap for IT service development that reduces costs and delivers better results.
Key considerations for technology integration in mergers and acquisitions
There are several elements that should be considered when planning for a technology integration following a merger or acquisition. This framework will help form the foundations of an integration effort that builds on the strengths of both companies to drive the business goals of the unified whole.
Synergy – The integration should combine pieces of each company to form a more complete, more effective whole. This involves the maximization of revenue streams through embedding key products from the target company into the parent company, or vice versa. It also involves recognizing that some elements should be left segregated in order to achieve maximum cost effectiveness or efficiency. Elements that should be considered include people, operational elements, applications and services, and enabling technology.
Time to Market – As a combined entity, a variety of factors will change the time to market for products and services. Leveraging skills and resources from both companies can help expedite development, testing, and production, allowing products to be created faster and less expensively. In some cases, this can also work in the opposite direction, as integration problems can cause inefficiency and other problems.
Cost – The costs of any integration efforts will be a major component of developing effective strategies. Companies must examine the expenses of a chosen integration roadmap, as well as the savings it will allow for, in order to make better decisions about what to keep segregated and what to combine.
Innovation – Bringing together the varied talent and resources of two companies can lead to a dramatic increase in innovation. Companies may have the ability to develop new products faster, combine technologies to create more effective solutions, and benefit from an influx of ideas. However, it is also important to take steps to make sure that innovation is not stifled by incompatible culture changes or processes that aren’t effective in a new environment.
Creating an integration plan
In order to agree upon an integration model and ensure that the IT organization is ready to carry out the necessary changes demanded by that model, careful evaluation and planning are necessary.
Start with a baseline assessment
Designing an effective plan for technology integration that meets broader business goals should involve careful assessment of both companies. This assessment should identify any redundancies in people, systems, infrastructure, applications, vendors, capabilities, and costs. It should also identify opportunities to add value through integration or collaboration and look for areas in which there are gaps that need to be addressed. This process should be broken down into the examination of several distinct components for each company:
People and organization – Assessments must evaluate the skills, capabilities, and overall organizational structure of both companies. Look for overlaps in employee capabilities, potential power structure problems, and any issues that could arise from mismatched culture.
Processes – Each company has processes in place of particular maturity levels and other characteristics. Learning how to mesh the way both companies accomplish goals is a critical step in achieving a successful union.
Infrastructure & Applications – A careful inventory of each company’s applications, systems, and infrastructure must be made to look for areas of overlap and to allocate resources in the most effective way. Other issues to address include relative scalability of infrastructure, the suitability of adopting resources to new tasks, and adherence to industry best practices.
Strategic alignment & governance – During a merger or acquisition, there should be mechanisms in place to ensure each company is aligned in terms of its goals and accountability. This will help avoid conflicts of interest and problems in the power structure of the new organization.
Financials – There should be a careful analysis of IT costs by function and activity for each company. This allows the company to identify items that do not provide a positive value and to integrate in the most cost effective way.
Create timeline & project portfolio – After making an assessment of the current state and deciding on an integration model, companies must begin creating a plan for the projects necessary to accomplish those goals. This should involve a structured timeline with milestones and metrics to judge progress. This allows IT to prioritize projects according to the needs of the business and allows for a more organized approach to integration.
Address risk mitigation – Technology integration inherently involves risk. IT leaders should look at every decision in terms of the potential costs and pitfalls compared to its benefits. This allows for planning that is based on logical analysis of the facts and reduces the chances that the integration will fail due to unforeseen outcomes.
Develop a structure for execution – Planning is only half the battle. Companies must also put solid structures in place to ensure any projects that have been delegated are followed through on. This involves ongoing dedication to the integration process and requires substantial commitment on the part of IT management to ensure that early work is not undone by later mistakes or lack of will.
Why workplace loyalty isn’t what it used to be and what your company can do about it
There was a time when employees were loyal to one organization for many years, decades, or even an entire career. This was in large part due to the stigma associated with regularly switching jobs, which branded employees as disloyal or problematic.
Today, these former notions of workplace loyalty have been thrown out the window. Why the change? Pensions are limited, technologies and markets are continuously changing, opportunities are immense, global business is booming, and employees now have the leverage to do what is best to advance their personal situations. In order to stay competitive, companies must learn to adapt to this new trend and use it to their advantage.
A recent rise in job hopping
In the past few decades, average employee tenure has seen a steady decline. In fact, a Gallup study found that 60% of millennials are open to new opportunities. This has made it more challenging for companies to retain effective talent and increased the need for more effective recruiting efforts. But why is the change occurring and what does it mean for your company?
Employees seek growth – Many professionals today want more than just a comfortable salary and a secure position. They want continuous development, education, and advancement. If professionals do not feel challenged or do not foresee a significant climb up the corporate ladder, they have no qualms about going elsewhere.
Employees seek flexibility – With the rise of telecommuting and mobility, along with a migration from the typical 9-5, professionals now have opportunities they may not have had access to in the past. Today, professionals, retirees, and stay-at-home parents can work without leaving their homes, start businesses simply by setting up a website, or freelance more easily than ever before. The traditional 40-hour work week at the office has significantly evolved.
Employees seek engagement and impact – A 2015 Gallup poll measuring employee engagement in the US found that just 32% of employees felt engaged in their jobs. Employees desire flexibility, creativity, and purpose, and when their employer offers these, they are more committed to their team and the future success of the business. When employees understand their role and responsibilities, have what they need to be successful, and can see the connection between their role and the overall organizational purpose, they are less likely to explore new external opportunities.
How to deal with the shift
Studies on the cost of employee turnover estimate that every time a business replaces a salaried employee, it costs 6 to 9 months' salary on average. Therefore, it is important for companies to implement practices that are better adapted to the new trend.
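That rule of thumb is easy to quantify. The quick sketch below uses an illustrative $120k salary, not a figure from the studies themselves:

```python
def turnover_cost(annual_salary, months_low=6, months_high=9):
    """Replacement cost range using the 6-9 months-of-salary rule of thumb."""
    monthly = annual_salary / 12
    return monthly * months_low, monthly * months_high

# Replacing an employee earning $120k per year:
low, high = turnover_cost(120_000)  # → (60000.0, 90000.0)
```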
Exploit the positives – Employee turnover has downsides – there is no doubt about it. There are significant costs involved in having to recruit and onboard new talent, plus there is a loss of knowledge that an employee who exits takes with them. However, there are upsides as well. Continuously bringing in new talent ensures fresh ideas and insights, preventing stagnation and potentially driving new competitive advantages.
Give valued employees incentive to stay – Today’s professionals want opportunities for growth, competitive pay, flexibility, and an attractive work-life balance. Companies that recognize and effectively address this can retain employees by giving them what they need to be happy. Be creative! Allow executives to work from home. Provide ongoing training or enable employees to temporarily explore other roles within the business. Ensure that employees understand their career ladder and what they need to do to take that next step. Create an effective, innovative workplace that encourages employees to be creative. Establish spiffs or bonuses that encourage competition.
Work with quality recruiters – A critical piece in all of this is finding and retaining top talent. Organizations should work with high quality recruiters who have specific knowledge of industries and trends. This is particularly important in IT, where specialized knowledge is critical to verify that employees have the skills and experience necessary to succeed in the workplace. By effectively managing recruiting efforts and partnering with specialized firms, companies have the ability to draw from a large pool of qualified applicants in an extremely competitive market and ensure that they remain ahead of the hiring curve.
upGrow — an affiliate of WGroup — staffs with higher standards. upGrow's experience working with Fortune 1000 companies has taught us to recognize the challenges many organizations face in staying competitive while building a responsive, agile technology team. To stay ahead of the curve, organizations must find the right people who can lead, support, and optimize IT initiatives. Click here to discover the upGrow difference!
Blockchain: What it is and why it matters to IT and business leaders
Blockchain is a data structure that makes it possible to create a secure, distributed digital ledger of transactions between two or more parties without the need for a central clearinghouse. Blockchain originally rose to prominence as Bitcoin's digital ledger, recording transactions for the cryptocurrency. Its ability to store data publicly and anonymously is what gives Bitcoin its security and immense flexibility. By distributing data across a large infrastructure, it is possible to maintain highly accurate, secure records. When transactions or other data are added, the other nodes (parties) must evaluate the addition and compare it to their own records, eliminating the need for central data storage or verification. Illegitimate additions are automatically identified by the system and rejected.
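The chained, tamper-evident structure described above can be sketched in a few lines of Python. This is a toy model showing only the hash-linking idea; it omits the distributed consensus among nodes that real blockchains rely on:

```python
import hashlib
import json

def block_hash(block_body):
    # Hash the block's contents, including the previous block's hash,
    # so tampering with any earlier block invalidates every later hash.
    payload = json.dumps(block_body, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def add_block(chain, transactions):
    prev = chain[-1]["hash"] if chain else "0" * 64  # genesis marker
    body = {"prev_hash": prev, "transactions": transactions}
    chain.append({**body, "hash": block_hash(body)})
    return chain

def verify(chain):
    """Recompute every hash and link; any edited block breaks the chain."""
    for i, block in enumerate(chain):
        expected_prev = chain[i - 1]["hash"] if i else "0" * 64
        body = {"prev_hash": block["prev_hash"], "transactions": block["transactions"]}
        if block["prev_hash"] != expected_prev or block["hash"] != block_hash(body):
            return False
    return True
```

If a bad actor edits an old transaction, `verify` fails immediately; in a real network, every honest node performs this check before accepting an addition.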
Why blockchain matters for business
Although blockchains were first popularized by Bitcoin, they have applications far beyond cryptocurrency. The technology’s ability to allow two or more parties to have complete confidence in transactions without the need for a centralized third party has obvious benefits in the world of business. Today, so called blockchain 2.0 technologies are being developed that will allow two parties to create sophisticated automated contracts that allow for profits to be automatically divided or payments to be automatically made after certain events. This reduces the need for escrow services and clearing houses while expediting transactions and settlements. Blockchain technology may also provide a more secure platform for transactions than traditional means, reducing fraud and necessary security infrastructure.
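The automated-settlement idea behind these "blockchain 2.0" contracts can be illustrated with a toy escrow rule. This is plain Python standing in for on-chain contract logic; the function and its parameters are hypothetical, not any real smart-contract platform's API:

```python
def settle_escrow(event_occurred, amount, payee_shares):
    """Split a held amount among payees once a triggering event fires.

    payee_shares: mapping of payee -> fractional share (must sum to 1).
    On a real smart-contract platform this rule would execute on-chain
    automatically; here it is an ordinary function for illustration.
    """
    if not event_occurred:
        return {}  # event not yet triggered: funds stay in escrow
    assert abs(sum(payee_shares.values()) - 1.0) < 1e-9, "shares must sum to 1"
    return {payee: amount * share for payee, share in payee_shares.items()}

# Once delivery is confirmed, $10,000 splits 70/30 with no clearinghouse:
payouts = settle_escrow(True, 10_000, {"supplier": 0.7, "broker": 0.3})
# supplier receives ~7000, broker ~3000
```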
Real-life examples already exist. Nasdaq, an early adopter of blockchain, is using the technology to allow private companies to issue stock and stockholders of public companies to vote their shares. Everledger is using it to create a registry of diamonds to suppress trade in "blood diamonds" from conflict zones.
Despite their significant advantages over centralized data structures, blockchains are not without challenges. Distributing data across a wide range of nodes may present a larger attack surface for bad actors than traditional data storage methods. Although this isn't a problem in blockchain's original implementation, Bitcoin, where all data is public and anonymous, it could be a problem in implementations where keeping record details private is critical.
As reported by the news media, the Ethereum blockchain, an online ledger that records transactions and lets users trade ether (the second-most-popular cryptocurrency behind bitcoin), was involved in a high-profile hack. A key feature of Ethereum lets people create smart contracts, which are written in computer code and run on Ethereum's network. On June 17, 2016, the DAO (Decentralized Autonomous Organization) announced it had been hacked via the Ethereum network. It was reported that the hackers exploited a feature of the DAO's smart contracts to siphon off roughly $50 million of its members' contributions to the fund.
Blockchain in its current form may also be challenging to implement. Because it is an open-source project, it has many competing variations and standards. This may make it difficult for companies to create a single production-ready solution that meets their needs. Only in recent years have groups begun working to make the technology feasible for enterprise applications.
Ultimately, it’s critical that IT and business leaders stay aware of blockchain technology and investigate how it could impact their business. Its ability to store data accurately and securely without the need for a centralized service has many applications that may provide increased security and efficiency for the enterprise. Blockchains may not be perfect yet, and need further safeguards, but they undoubtedly provide a glimpse into the future of data structures and transactional records.
Standard technology recruiting practices no longer work. Here’s why… and what to do about it.
When technology hiring managers have a strategic vacancy in their organization that they need to fill, they typically kick-start their hiring process by creating a job listing that is then submitted to the human resources organization. From there, human resources will post the role externally, scan resumes and profiles for top skill keywords, briefly interview candidates with a list of canned questions, and then submit a stack of profiles for the hiring manager to review. Although this seems like a practical process, there is a problem: the vast majority of IT hiring managers that I speak with on a daily basis are not satisfied with the quality of candidates they are receiving. These standard practices lead to a flood of vastly underqualified applicants and a lengthy recruiting cycle, which in turn can lead to an underwhelming hire due to the urgency of the need. So what is wrong with IT recruiting, and how can companies improve the overall success of the hiring process?
What is wrong with IT recruiting?
While there are several key problem areas in IT recruiting, it’s not any one portion of the process that is solely to blame. Recruiters, staffing agencies, and employers can all make strides towards improving the process. Three suggestions for improvement are:
Create more realistic candidate expectations
Move the interview process towards real-life problem-solving
Enable internal teams to more effectively screen for technical aptitude
Overly ambitious job listings – In a cutting-edge, ever-evolving technology landscape, it seems that more often than not, companies are looking for a "unicorn" candidate – someone who understands outdated technologies but also recognizes trends and the future of the industry, who has the ability to lead a team and set strategy but who will also get their hands dirty, who is young and aggressive but who also has decades of experience. Creating a "kitchen sink" job posting may seem like a good idea, but it is not, for several reasons. First, it can deter quality candidates who do not feel that they meet every single requirement on the list. According to a study completed by HP, women working there applied for promotions only when they met 100% of the qualifications listed in the job posting, whereas men would apply if they met 60%.1 Workplace diversity, specifically in technology, is important across all organizations, so there needs to be flexibility in job postings to provide greater inclusion of applicants. Overly ambitious job postings also encourage subpar candidates to exaggerate their skills, which can lead to lengthy periods of time spent vetting unqualified talent.
By creating a more realistic and flexible job description, focused on the skills and responsibilities that are critical to the role, companies will be able to identify a diverse pool of qualified candidates more quickly, ensuring faster time to value for their team.
Ineffective recruiting processes – Evaluating resumes by skill keywords might be a good first indicator of the strength of the candidate, but it does not really tell you what a candidate can do. By giving applicants a real world problem, companies can get a better idea of true technical aptitude, how someone reacts to real business challenges, and how they communicate solutions. This can provide much greater insight into how someone will perform on the job – and it is certainly a much better indicator than a few bullet points written on a resume.
Nontechnical recruiters – Effectively screening candidates for a technical role or project requires specialized knowledge that most HR employees simply do not have. Anyone can ask questions about years of experience and tools used to check off boxes, but it takes someone with technical aptitude to truly grasp the breadth of knowledge and the scope of a candidate's experience. When candidates are not properly screened from the start, the burden of vetting technical skill falls on the hiring manager, leading to lengthy cycles.
The problem with staffing agencies
Staffing agencies claim to take the headaches out of IT recruiting by providing companies with a pool of prescreened, quality applicants. While they do have the bandwidth to recruit and most likely have a database of people they can tap into, many do not have the technical experience to vet candidates, especially for strategic roles. Most staffing agencies employ recent college graduates, not technical leaders, so their ability to screen candidates for "quality" is minimal. This often exacerbates IT hiring problems rather than solving them, further complicating the hiring process.
In order to overcome all of the issues mentioned above, companies must partner with staffing agencies that specialize in technology recruiting. Outsourcing recruiting to an organization that truly understands technology requirements, challenges, and trends ensures mitigated risk and quicker time to value in the recruiting process. Above all else, it ensures quality candidates who can help an organization drive business outcomes.
Why Technology Consultants and IT Leadership Are Now Critical Partners to The CFO
As technology becomes a key component of practically every company’s financial strategy, CFOs are increasingly turning to internal partners and external consultants to cover gaps in their knowledge, provide meaningful insights and develop models that extrapolate future returns and benefits. Driving financially disciplined growth remains a cornerstone of the CFO’s role. In order to govern technology spend, while ensuring it is aligned to the delivery of business strategy, the CFO must remain an educated partner of the CIO. The CFO must stay abreast of new developments and work with IT leadership and external advisory organizations to ensure that the company is prepared for the future.
How is technology affecting the CFO?
Emerging technologies, the risk of aging solutions currently in place, data-focused solutions and the securing of sensitive information, and evolving regulations all impact the CFO. The role that technology plays in these areas is increasing daily; IoT, cognitive computing, automation, and the transformational evolution of current technology platforms and service delivery models will affect technology investment levels, internal controls frameworks, and regulatory attestation processes. The CFO needs to ensure, through strong collaboration with the CIO as well as trusted external advisors, a consensus on how to coordinate financial and technological priorities.
Technology and risk – Technology is a fundamental part of most companies’ risk portfolios. When downtime, errors, or compliance issues can cost a company thousands or even millions of dollars, it becomes critical that the CFO understand these potential problems and how they fit into the enterprise’s overall risk management strategy.
Technology and compliance – There is now a wide range of regulations dictating the proper use of technology, particularly when it involves customer financial information, medical records, or other sensitive data. In order to ensure that the enterprise is compliant with all regulations, the CFO needs advice and insight from those with technological expertise.
Technology and revenue generation – With the rise of eCommerce, digital products, and other technology-driven offerings, IT has become the cornerstone of many enterprise revenue models. Even companies that don't rely on technology as their primary driver of revenue still usually depend on it to support operations such as data analytics, marketing, and customer service. Without an understanding of how software and technology contribute to revenue within the organization, a CFO is unprepared to effectively ensure their company's continued financial health.
Technology and cost reduction – Technology enables the enterprise to more effectively share and analyze information, help customers, and perform daily tasks. All of this ultimately translates to increased efficiency and reduced costs.
How the IT leaders and outside consultants can help
The average CFO doesn’t have a thorough understanding of developing technology, but they must still be able to incorporate IT into their overall strategy in order to make better financial decisions for their company. In order to effectively cover this gap, the CFO must look to internal and external advisors. The CIO and IT leaders are ideal partners to the CFO and can provide much needed guidance on these issues. In some cases, it may also be necessary to seek outside perspective or specialized expertise from third party consultants. By seeking this guidance and uniting with technology leadership, the CFO can help IT deliver business results and ensure that the enterprise has a cohesive and forward looking risk, compliance, and revenue strategy.
Automation and cognitive computing are dramatically shifting the way employees work and business is done. They function as the next generation of outsourcing, shifting its focus from labor arbitrage to labor itself. These tools allow tasks to be performed faster, more efficiently, and with greater accuracy than ever before, while shifting the role of human workers to more creative and interpersonal jobs.
The cognitive wave is already having a major impact on business, and its effects are only likely to become more pronounced in the coming years. As computer technology becomes more advanced and developers create tools that can perform tasks previously only doable by human employees, work and the role of the employee are shifting. Building on this and incorporating it into the business's innovation strategy is critical to maintaining a competitive edge in the future.
Business unit needs
As an IT leader, it is important to understand how the business can use automation and cognitive computing to increase efficiency, reduce costs, improve accuracy, and deliver better customer service. This will help the department to provide better guidance to business leaders when attempting to help them reach their goals. By helping incorporate cognitive computing and automation into business units, IT can facilitate the business to function more effectively and can maintain its own key role of technology leadership within the company.
Increased speed – Automation tools can perform certain tasks faster than any human worker could. This is particularly true in areas in which computers excel, such as sorting through large volumes of data or performing mathematical operations. This means that companies have the opportunity to significantly outpace the competition, deliver better service, and increase efficiency across the organization.
Personnel redeployment – Automation can perform menial tasks better and faster than human workers. This means that companies have the opportunity to redeploy their personnel to more interesting and productive areas that can grow the business. Humans no longer have to perform the boring work of crunching numbers or handling simple password resets; they can now be reassigned to areas where they can make a more significant contribution to the enterprise.
Increased accuracy – One of the greatest advantages of using automation tools is their ability to perform actions in a highly accurate, repeatable manner. Although in some cases there is no substitute for employee oversight, in specific roles automated tools can perform a task better and faster than any human. Reviewing large data sets, automatically troubleshooting and addressing common IT problems, and providing access to a wealth of information are all roles in which computers can outperform their human counterparts.
Better customer experience – Although customer service may be primarily associated with human connection, cognitive computing can still play a key role. Providing automated service for simple customer requests can greatly speed resolution of any problems that may arise while simultaneously reducing the workload on the human staff. This can significantly improve customer service productivity and improve brand value.
Multiplying productivity – Although many workers are afraid that automation will take their jobs, this is usually not the case. Instead, the technology acts as a productivity multiplier, helping the workforce focus on more human oriented and creative tasks, while leaving repetitive work to the machines. This can drastically improve employee productivity, allowing them to solve problems faster, and provide for greater innovation within the company.
Reduced costs – Increased productivity, reduced errors, and improved speed all translate to significant cost savings after automation has been implemented. The technology allows for an increase in the amount of work the company can perform while simultaneously decreasing the cost of that work, meaning that automation has one of the most advantageous cost/benefit ratios of any IT initiative.
If you’d like to learn more about cognitive automation, request a copy of our ebook, the Cognitive Automation Primer.
The next few years are expected to be the “era of cognitive automation.” Author Israel Del Rio presents a comprehensive primer to this disruptive technology. Cognitive automation is poised to have the same impact in the 2010s as PCs in the 1980s, the Worldwide Web in the 1990s, and mobile in the 2000s.
This 70-page ebook, Cognitive Automation Primer – The World of Machine Learning, is a must-have book for anyone involved in transforming IT and leveraging breakthrough technology for business impact.
Automation introduces major changes to the workplace. There will be questions about how it will affect productivity, the workforce, and the way organizations are run. In order to prepare for the future of automation and autonomics, it is critical that organizations examine how this rapidly changing technology will affect their business model, profitability, and employees.
How will A/A affect the workforce?
Perhaps the single greatest concern about automation is its effect on the workforce. This fear has slowed the adoption of automation, as there is often great pushback from employees. Historically, automation has actually had little to no correlation with unemployment. Although German industries installed a far higher proportion of robotic equipment between 1997 and 2007 than industries in America, they actually saw less job loss in the manufacturing sector. Other data show that there is an essentially flat correlation between the percent change in use of automated systems and percent change in unemployment across countries all over the world1.
How will it impact employees?
Although automation may not lead to job loss on the whole, it can still have a significant effect on employment and work activity. Its effects can be thought of as similar to those of outsourcing. In the short term, it can lead to job losses and shifting opportunities. Those doing manual or repetitive tasks are often the first to be impacted; however, the effects of automation can quickly spread to other areas. This can become especially problematic when those who are installing or maintaining systems become affected. Bringing in tools that would displace systems engineers can be challenging, as they may have a vested interest in seeing that the implementation is not successful. That's why it's critical that organizations understand how to properly make use of their workforce in a new, highly automated work environment.
Restructuring the workforce
Although automation can bring short-term job loss and dramatic shifts, it can also free resources to create higher-level, more rewarding jobs in other areas. As software and robotic systems displace their human counterparts, resources can be shifted to other departments. At the core of an effective automation implementation is an expert who knows how to minimize problems and ensure that human and autonomic resources are being used to their fullest extent. A robust management team is also needed to oversee external and internal services.
It is necessary, at least initially, to have staff who work alongside the automated workforce, stepping in occasionally to fix errors or address problems that are too complicated for the virtual engineers. This will, however, shift as the system evolves. Autonomics allows the systems to improve their quality of service over time. Each time the systems deal with simple tasks, they amass more data in their knowledge repository, which they can later use to handle more complex problems. As virtual engineers become more sophisticated, the human workforce becomes less necessary.
WGroup has just released a new, 70-page ebook on Automation and Autonomics, entitled Cognitive Automation Primer: The World of Machine Learning. You can get your own copy of this comprehensive ebook at no charge by visiting http://www2.thinkwgroup.com/Cognitive-Automation-Primer.
CIOs Increasingly Partner with Top Executives To Increase Business Competitiveness
Technology must be a core facet of strategic business design for organizations to continue to grow, gain share, and lead their categories. Technology-agnostic delivery models focused on the services delivered and business outcomes will, if deployed correctly, drive competitive advantage. The evolution of the CIO from an information technology executive representing the business to a business executive utilizing all components of technology to drive results, embracing disruptive technologies and focusing on transformation rather than transaction, has never been more critical.
From technologist to technology representative and business advisor
Organizational transformation to meet the challenges of enhanced regulations, a rapidly evolving global customer base with unique generational engagement demands, and a continually morphing competitive landscape demands a new perspective and engagement model for the CIO. CIOs need to influence, and directly engage with, the highest executive levels, including the board.
New customers expect technology focus
The past two decades have seen the information technology function immersed in delivering technology packages that essentially automate manual processes and deliver historical data about performance. IT organizations have traditionally reported to the CFO and have been run as cost centers, with the CIO's role focused on optimizing labor arbitrage while always seeking ways to do more with less. While this was happening, customer expectations and demands changed. Immediate gratification, personalization, an always-connected and always-on mindset, data and more data, peer influence, and a social, mobile universe of opportunity have changed consumers' and workers' viewpoints. In order to capitalize on this, innovative companies are realizing that their traditional approaches to technology will not work and that they must transform in order to survive in the long term. CIO engagement in this dialogue as a peer and influencer of strategy may be the biggest challenge the CIO has ever faced, but it has never been more critical.
Technology is now driving strategy
An effective strategy has always been one centered around driving business value. Alignment of the technology strategy to the business strategy has traditionally been focused on enablement of an already defined business direction. One could possibly argue that this needs to be reversed with the technology strategy now being the catalyst that evolves into the business strategy.
WGroup has helped many CIOs assess and transform their IT strategic frameworks, governance structures, and operational processes to meet the sometimes competing demands of the business and emerging trends in IT. We adopt a pragmatic approach to implementing new IT capabilities that balances future needs with short-term improvements and benefits. Many IT transformations are designed to be self-funding, with subsequent phases exploiting the success of prior investments and improvements. Visit http://thinkwgroup.com/services/strategy-transformation/ to learn what we can do for your organization.
Milestone targets will help reveal true progress – or failure
Many failed projects suffer from the same tactical mistake: doubling down on failure in the hope of a different outcome. This behavior is well known as the gambler's fallacy: continuing to gamble in the hope of recouping losses, which results in even greater losses. Good project management, on the other hand, entails detecting when a project is being derailed and then planning appropriate, timely remediation, or even changing direction. This is why it's best to partition a project into discrete, well-quantifiable milestone targets. That way, if something fails, it will only affect a portion of the effort.
Focusing less on micro-managing every activity and more on enabling the timely delivery of milestones that break-down the project into specific sub-projects is a more flexible approach.
Handling issues occurring at a sub-project level is less difficult. Also, having stand-alone milestones allows for possible early delivery of actual functionality. It is even possible to create some milestones to serve as a “canary in the mine”, as long as these milestones do not create unnecessary pressures or distract from the main path of the project.
Needless to say, the key to tracking and assessing the status and impacts of a project's dynamics is the project management function. However, there is a mistaken idea that anyone with working knowledge of MS Project can become an IT project manager. The fact is that in the world of IT, delivery managers with systems and software knowledge have traditionally been more successful in the project management role, perhaps because this type of project manager can assess the health of a project against actual milestone deliverables rather than via traditional Gantt charts with red-yellow-green status colors. This semaphore style might look good on status reports but does little to ascertain the true project status.
When establishing the project management governance, keep in mind that there are projects and, well, there are projects. Smaller projects have to be handled in a manner that assures the necessary ingredients for success are available to the team leader with a minimum of red tape. Smaller projects can benefit greatly from rapid application development methodologies1 and from early prototyping. In addition, smaller projects better align with the iterative approach favored by Agile methodologies. Partitioning a large project into smaller areas of work also facilitates the potential testing and introduction of innovative solutions while minimizing risks. However, partitioning should not be viewed as “fragmenting”. There is science and art in the way to partition a major initiative while maintaining a holistic and integrated view of the whole. In this sense, the principles followed in Agile development (planning game, small releases, simple design, etc.) also apply to milestone-based project management.
What defines a small project?
Well, having no more than five people working on a deliverable for a maximum of four months is as good a metric as any. If the cost of this type of project exceeds the one-million-dollar mark including labor, hardware, and software capital costs, then it should not be handled as something small.
Then there are the medium-size endeavors. These are projects that can take up to a year, or slightly longer if the powers-that-be are willing to take Prozac and give you a bit more leeway. The core development teams for this type of project should never exceed the magic number of twelve. These projects tend to fall in the range of two to three million dollars. A medium-size project needs to be managed with a savvy combination of high-level project management controls and the appropriate placement of deliverable milestones.
Larger projects should not even exist. Why not? Well, a large project should be properly broken down into as many small and medium sized projects as possible. Ultimately, a large project should only exist in the PowerPoint® bullets of those responsible for your company’s public relations, in the minds of marketing (“Version 3 of our Super-duper product available by Christmas!”), and in the radar of the very small team responsible for the integration of all the various moving parts.
Clearly, the failed initial launch of Healthcare.gov was an example of a humongous project that was managed on a task-oriented basis. It had only one real milestone: go into production by October 1, 2013. We all know what happened next. The story would have turned out a lot differently if a milestone-based approach had been followed.
So, what is agile-based project management really all about?
Basically, it’s the definition and tracking of measurable, well-defined milestones. Note that we are not suggesting that the need to plan for tasks goes away; just that the project status should not be measured by the degree of task completion or the number of hours worked on a task, but rather against each milestone’s success or lack thereof. Milestone management is based on outcomes measured against the actual business requirements the project is attempting to address.
The difference between a milestone and a traditional project management event is that the milestone should stand on its own as a visible deliverable. Visible is the operative word here. Completing a piece of code is not a milestone. Completing a working prototype or demonstrating a working sub-component of the deliverable are proper milestones. An eye-candy demo is not. If the event is something that can be shown to anyone outside the core development group, and is considered to be at least a partial success, then it qualifies as a milestone.
Tracking projects via milestones has a number of benefits:
Missing a milestone might be a clear signal that the project is not on track, but the impact of the delay can be contained in a timelier fashion
A successful milestone motivates the team by providing success checkpoints
Milestone deliverables with stand-alone utility can deliver actual benefits sooner
A successful milestone helps motivate the project sponsors to continue their support of the project even when faced with budgetary constraints
There is no way to sugar-coat a missed milestone. While this should not necessarily spell gloom-and-doom, a project with two or more missed milestones is a project that needs to be seriously reviewed and revised.
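The rule of thumb above, that two or more missed milestones call for a serious review, can be reduced to a simple status check. The Python sketch below is purely illustrative; the function and milestone names are hypothetical, not part of any real project management tool.

```python
def project_health(milestones):
    """Assess project health from milestone outcomes, not task hours.

    `milestones` is a list of (name, met) pairs. Per the rule of thumb
    in the text: two or more missed milestones means the project needs
    a serious review and revision.
    """
    missed = [name for name, met in milestones if not met]
    if len(missed) >= 2:
        return "review required", missed
    if missed:
        return "at risk", missed
    return "on track", missed

status, missed = project_health([
    ("working prototype demo", True),
    ("subsystem A integrated", False),
    ("pilot rollout", False),
])
print(status, missed)  # review required ['subsystem A integrated', 'pilot rollout']
```

The point is that status derives from visible deliverables succeeding or failing, not from hours logged against tasks.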
To be sure, it’s always best to allow room in a large project to anticipate the failure of a specific component and to adjust for the overall delivery because of this failure. The solution to this type of quandary might range from starting again from scratch (assuming the initial implementation was proven wrong), to completely eliminating the subsystem and remediating by providing a suitable minimum set of capabilities from other subsystems.
Ultimately, agile, milestone-driven project management will be superior to the traditional task-oriented project management style.
1 NOTE: Use of Agile methodologies should apply to the actual software development process, not to the architecture and design processes. Using pseudo-Agile approaches under the mantra of “code now, design later” often results in failure.
Keep an eye on your inbox or check back at www.thinkwgroup.com in mid-September to get your copy of the full white paper on Agile-Based Project Management.
Big data can be monetized in three fundamental ways. These are each discussed below.
Big data can be used to improve operations, thereby reducing costs, improving efficiencies, ramping up sales, and increasing profits.
It can also be sold, licensed or shared with other organizations as a product.
It can be used to build a “multi-faceted” business, or even to launch new businesses.
The use of big data analytics (BDA) is helping decision makers in dozens of industries. While it’s impractical to detail each one, the summaries below point out specific examples in key industries where BDA has gained a foothold and allowed businesses to extract monetary value by improving their operational efficiencies.
Manufacturers have used automation on the shop floor for decades. Today, companies are using big data produced by sensors built into manufacturing equipment to minimize outages through predictive maintenance. Such machine-produced data is expected to increase by more than 40 percent by 2020.
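The predictive maintenance described above often starts with something as simple as flagging sensor readings that deviate sharply from recent behavior. The sketch below is an illustrative assumption, not any vendor's actual algorithm; the window size, tolerance, and vibration figures are all hypothetical.

```python
import statistics

def flag_anomalies(readings, window=5, tolerance=3.0):
    """Flag readings that deviate sharply from the recent rolling mean.

    A reading is flagged when it differs from the mean of the previous
    `window` readings by more than `tolerance` standard deviations.
    """
    flagged = []
    for i in range(window, len(readings)):
        recent = readings[i - window:i]
        mean = statistics.mean(recent)
        stdev = statistics.stdev(recent) or 1e-9  # guard against zero spread
        if abs(readings[i] - mean) > tolerance * stdev:
            flagged.append(i)
    return flagged

# Steady vibration levels, then a spike that may predict a bearing failure.
vibration = [1.0, 1.1, 0.9, 1.0, 1.05, 0.95, 1.0, 4.8, 1.0]
print(flag_anomalies(vibration))  # [7]
```

Catching the spike at index 7 before the bearing fails is the essence of predictive (rather than reactive) maintenance.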
Healthcare operators can use BDA to map a patient’s data to the records of other patients, identifying patterns that can offer a more accurate diagnosis. BDA also helps prevent hospital readmissions that result from insufficient treatment on the initial admission, or from an incorrect initial diagnosis.
Commercial lending uses BDA to avoid risk by adding social media data to traditional risk assessment tools (credit reports, public data, etc.)
Energy companies have deployed smart meters in recent years and outfitted their distribution networks (electrical transmission lines, pipelines, etc.) with sensors that give them near-instant awareness of problems. These advances reduce the cost of meter reading and of maintaining countless miles of distribution lines, pumping stations, and substations.
The transportation industry has tapped crowd-sourced big data coming from mobile phone apps and, in some cases, from specially designed traffic monitors placed throughout a city.
While each industry uses big data differently, big data reveals more information, new information, and new patterns of information that give insight traditional warehoused data cannot. Blending big data with traditional data stores adds further context that allows better decision making. For example, the city of Boston maintains data on roads and highways that need repair. Citizens have long been able to report potholes and other obstructions via phone. The city decided its traditional phone reporting method could be enhanced with big data, so it created an Android app named “Street Bump.” The app uses the GPS and accelerometer in citizens’ phones to detect and report potholes and bumps that need attention. This crowd-sourced data is melded with existing operational data to facilitate road repairs, giving the city a richer and near real-time view of repairs needed on the city’s roads and highways. Combining existing data with big data adds the context that can impact budgeting, forecasting, and requirements for manpower and equipment.
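A Street Bump-style detector can be approximated as a threshold on vertical acceleration tied to GPS coordinates. The sketch below is purely illustrative: the threshold, field layout, and coordinates are assumptions, not the city of Boston's actual algorithm.

```python
def detect_bumps(samples, threshold=3.0):
    """Report GPS locations where vertical acceleration spikes past a threshold.

    `samples` is a list of (lat, lon, vertical_accel_g) readings; the
    threshold of 3 g and the sample data are hypothetical.
    """
    return [(lat, lon) for lat, lon, accel in samples if abs(accel) > threshold]

drive = [
    (42.3601, -71.0589, 0.2),   # smooth road
    (42.3605, -71.0581, 3.8),   # sharp jolt: likely pothole
    (42.3610, -71.0574, 0.4),
]
print(detect_bumps(drive))  # [(42.3605, -71.0581)]
```

Each flagged location would then be cross-referenced against the city's existing repair records, which is where the blend of crowd-sourced and operational data pays off.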
Sell, License or Share Your Big Data
In choosing to monetize big data as a product, you must give value to the buyer; it must be capable of revealing new information the buyer can use to advance toward its business goals. It might help the buyer answer questions about risk, the value of an asset, or its future value. It might reveal insight into a market or customer behavior. But what gives a data product value? These are the essential ingredients:
The data may be high velocity, which means it is real-time or near real-time. Uber, the ride sharing company, gives customers the ability to find an available ride within a given radius, and tells the consumer how soon the ride will arrive. The Uber app used by customers and drivers handles both supply and demand in real time. It uses and produces high velocity data. From another direction, retailers often provide Wi-Fi in their stores so they can track the movement of patrons through the store based upon their smartphone signals.
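The radius lookup behind a service like the one described above can be illustrated with a great-circle (haversine) distance filter. This is a hypothetical sketch, not Uber's actual implementation; the driver names and coordinates are invented for illustration.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two (lat, lon) points in kilometers."""
    r = 6371.0  # mean Earth radius in km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def rides_within(rider, drivers, radius_km):
    """Return (distance_km, driver) pairs inside the radius, nearest first."""
    nearby = []
    for name, lat, lon in drivers:
        d = haversine_km(rider[0], rider[1], lat, lon)
        if d <= radius_km:
            nearby.append((d, name))
    return sorted(nearby)

rider = (40.7580, -73.9855)  # illustrative coordinates (Times Square)
drivers = [("car_a", 40.7614, -73.9776), ("car_b", 40.7306, -73.9866)]
print(rides_within(rider, drivers, radius_km=1.0))
```

A production system would also maintain a spatial index so it never scans every driver per request, but the distance filter is the core of the high-velocity matching.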
The data product may offer greater precision than what is already available to the buyer. In today’s environment, large-scale digital systems such as mobile phone networks measure and record subscriber activity in great detail. Location, time of day, length of call and other facts become data as a subscriber moves through each day. Likewise, a web site visitor’s path through a site can be recorded in great detail. Similarly, a GPS tracking device like those often used by parents to monitor their novice teen drivers sends location, start/stop times, speed and other details to a server. Another example: many web sites carry product reviews; however, without appropriate safeguards, fake reviews can be posted to either bolster or disparage a product. Booking.com, for example, not only restricts reviews to people who are verifiable customers, it maintains quality control over each posting. As a welcome added feature, it also allows consumers to see reviews from their own demographic group: solo travelers, business travelers, families, etc. These are all examples of precise data.
It may offer greater scale. For instance, a data product that includes all data in a given population may have value to a buyer who, until now, has only been able to accumulate sampling data. A cellular operator such as Sprint or AT&T collects data with every phone call. A researcher seeking information on subscribers no longer has to work with a sample population of “N” subscribers. With all data collected, “N” becomes the entire universe of subscribers. There is no sampling error because there is no sample: the measurement covers the entire population rather than estimating it.
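The difference between sampling and measuring the whole population can be illustrated with a small simulation. The "subscribers" and their call minutes below are synthetic, and the figures are purely illustrative:

```python
import random

random.seed(42)

# Synthetic "population": monthly call minutes for every subscriber.
population = [random.gauss(300, 80) for _ in range(100_000)]
true_mean = sum(population) / len(population)

# A traditional researcher works from a sample of N subscribers...
sample = random.sample(population, 500)
sample_mean = sum(sample) / len(sample)

# ...and must accept some sampling error. With the full data set,
# the "estimate" is exact: there is nothing left to estimate.
print(f"Population mean: {true_mean:.1f}")
print(f"Sample mean:     {sample_mean:.1f}")
print(f"Sampling error:  {abs(true_mean - sample_mean):.1f}")
```

Re-running with a different random seed changes the sample mean but never the population mean, which is the point: error comes from sampling, not from measurement.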
Another component one can add to the creation of a marketable data product is known as data fusion. Merging proprietary, internal data with public data or data from social media or another firm can create new insights. Data posted on Facebook and other social sites provides rich detail on a person’s interests, activities and preferences. This can be married with other data sets to create powerful new insights.
For example, ChoicePoint Precision Marketing (now LexisNexis Risk Solutions) maintained more than 17 billion records of individuals and businesses, scoring cohorts on factors such as home ownership, a “prosperity index,” bankruptcies, Spanish-speaking households and many others. Combining such information with social data, for example, can sharpen an advertiser’s focus immeasurably.
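Data fusion of this kind is, at its core, a keyed join of internal and external records. A minimal sketch in Python, with entirely hypothetical customer IDs and attributes:

```python
# Internal, proprietary records keyed by customer ID (illustrative data).
internal = {
    "cust-001": {"lifetime_value": 1240.50, "segment": "retail"},
    "cust-002": {"lifetime_value": 310.00, "segment": "wholesale"},
}

# External or public attributes for the same customers (illustrative data).
external = {
    "cust-001": {"home_owner": True, "prosperity_index": 0.82},
    "cust-002": {"home_owner": False, "prosperity_index": 0.41},
}

def fuse(internal, external):
    """Merge two data sets keyed on customer ID into one enriched record set."""
    fused = {}
    for cust_id, record in internal.items():
        enriched = dict(record)
        enriched.update(external.get(cust_id, {}))  # tolerate missing external rows
        fused[cust_id] = enriched
    return fused

enriched = fuse(internal, external)
print(enriched["cust-001"])
```

In practice the join keys are rarely this clean; record linkage and identity resolution are where most of the real effort goes.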
Launch a New Business
Perhaps the company that has most visibly made launching new businesses part of its DNA is Google. (Its formation of Alphabet, Inc. as a parent company attests to the diverse interests Google has pursued.) Google.org’s blog post announcing its influenza-tracking initiative showed how collecting massive amounts of big data can be used to solve real-world problems.
While this free service is no longer operating, it could have become a standalone business selling data. Instead, Google became a major contributor to Calico Labs “whose mission is to harness advanced technologies to increase our understanding of the biology that controls lifespan.” Staffed by scientists, Calico intends to find interventions to slow aging and counteract age-related diseases.
Build a Multi-Faceted Business
Amazon, too, has collected mountains of big data on millions of book sales. With this view into the bookselling business, it launched the Kindle reader and encouraged writers around the world to self-publish their fiction and non-fiction in Kindle format. More recently, Amazon launched ACX.com. It is “a marketplace where professional authors, agents, publishers, and other rights holders can post fallow audio book rights. At ACX, those unused audio rights will be matched with narrators, engineers, recording studios, and other producers capable of producing a finished audio book, as well as with audio book publishers.”
Not missing a beat, Amazon bought Audible.com and encourages authors to publish their audio books through Audible to earn premium commissions and royalties. Along the way, Amazon also launched WriteOn, a site where aspiring writers can post their work and receive advice from peers. Amazon seems to have wrapped most of the writing, publishing and book selling business into these various business units. In the process it has become a model of the multi-faceted business. And all of this has been driven by creative thinking salted with big data.
For companies that have accumulated substantial volumes of big data, it may sometimes be possible to “flip” the relationships that exist to create a new data product.
Consider how the Amazon Prime Video service keeps track of what you watch, what devices you register (TV, phone, computer, etc.), and then makes recommendations on what else you might enjoy watching. Such “recommendation engines” are widely used at music streaming sites, at Netflix and other ecommerce sites. They help subscribers discover new products and encourage additional sales.
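A recommendation engine of the kind described can be sketched, in highly simplified form, as nearest-neighbor matching on watch histories. The users, titles, and cosine-similarity measure below are illustrative assumptions, not any vendor's actual method:

```python
from math import sqrt

# Toy watch histories: 1 = watched, 0 = not watched (illustrative data).
watch_history = {
    "alice": {"drama_a": 1, "drama_b": 1, "scifi_a": 0},
    "bob":   {"drama_a": 1, "drama_b": 0, "scifi_a": 1},
    "carol": {"drama_a": 0, "drama_b": 1, "scifi_a": 0},
}

def cosine(u, v):
    """Cosine similarity between two watch vectors sharing the same keys."""
    dot = sum(u[k] * v[k] for k in u)
    nu = sqrt(sum(x * x for x in u.values()))
    nv = sqrt(sum(x * x for x in v.values()))
    return dot / (nu * nv) if nu and nv else 0.0

def recommend(user):
    """Suggest titles the most similar user watched that `user` has not."""
    others = [(cosine(watch_history[user], v), name)
              for name, v in watch_history.items() if name != user]
    _, nearest = max(others)
    return [t for t, seen in watch_history[nearest].items()
            if seen and not watch_history[user][t]]

print(recommend("carol"))  # → ['drama_a']
```

Production systems use far richer signals (ratings, recency, devices) and matrix-factorization or learned models, but the shape of the computation is the same: similarity, then unseen-item lookup.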
However, when viewing preferences are combined with subscriber information, and further combined with demographic and economic information gleaned from list management companies, social media and other sources, the big data can be “flipped” to reveal new, marketable information.
For example, the subscriber’s home address serves as a data point that speaks to his or her income level. A service like Amazon Prime Video could identify subscribers according to their income cohort, then segment each cohort by their preferences in movies. Changing the view of that big data, “flipping it,” so to speak, gives a new view—and a new data product—that may be of interest to movie studios, producers and screenwriters.
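The "flip" described above amounts to regrouping the same records along a different axis: from titles per subscriber to genres per income cohort. A toy sketch with made-up data:

```python
from collections import Counter, defaultdict

# Illustrative subscriber records: (income_cohort, favorite_genre).
subscribers = [
    ("high", "drama"), ("high", "drama"), ("high", "sci-fi"),
    ("mid", "comedy"), ("mid", "comedy"), ("mid", "drama"),
    ("low", "action"), ("low", "comedy"),
]

# Forward view: recommend titles to an individual subscriber.
# Flipped view: for each income cohort, which genres dominate?
by_cohort = defaultdict(Counter)
for cohort, genre in subscribers:
    by_cohort[cohort][genre] += 1

for cohort, genres in sorted(by_cohort.items()):
    top_genre, count = genres.most_common(1)[0]
    print(f"{cohort}: top genre = {top_genre} ({count} subscribers)")
```

The underlying data is unchanged; only the grouping key changes, which is what turns an internal recommendation asset into a marketable audience-insight product.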
If you’re seeking ways to harness big data — or any other disruptive technology — to deliver business breakthroughs, or even just an incremental competitive edge in your business, WGroup’s principals can help. Visit http://thinkwgroup.com/why-wgroup/#/point-of-view for our unique take on rethinking IT. Or, if you’re facing pressure to transform your business right now, contact us for a consultation. There’s no obligation, and you’re sure to derive value from the conversation.
Big data is well understood, at least in a general sense. But it’s worthwhile to note that it can include content from a wide variety of sources. Among these: content from internal company data warehouses, social media, click stream information from the company’s websites, the content of customer emails, customer online product reviews, survey responses, details of mobile phone call records, photos, videos, SMS texts, transaction data, and data produced by “Internet of Things” sensors.
SAS estimates the amount of information stored worldwide totals 2.8 zettabytes (2.8 trillion gigabytes) today and will grow by a factor of fifty by 2020. Data centers across the planet now occupy some twenty-five square miles, roughly equivalent to the entire land area of Syracuse, NY.
Three key trends in big data deserve your attention as gathering business intelligence from big data continues to unfold.
First, big data is derived from previously untapped sources. All these new sources of “in the moment” data can be melded with historical data to allow predictions of what is most likely to happen in the future. With appropriate predictive analytics tools, analytics becomes much more than a look at “what’s already happened.” It gives one the ability to predict what will happen next. This convergence of historical data and “now” data is what companies need to make business decisions aimed at growing profitability.
Second, there is a need for automation technology. With data flowing into an enterprise from so many directions, automation is a baseline requirement. Machines are well equipped to process huge volumes of information; people are not. The sheer volume, velocity and variety of big data you might choose to evaluate far outstrip a human’s ability to process it.
Third, getting value from big data calls for flexible, less fragile, more adaptable systems. You might have built data stores that, after much planning, debate and discussion, specify particular formats for the data to be saved in various databases. Big data cannot be processed by systems that require information to be stored with rigid schema that need re-engineering every time a new source of data appears. Instead, analysis of big data requires you build a processing infrastructure that’s flexible and adaptable.
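This flexibility is often called schema-on-read: events are stored as raw records, and analysis code tolerates fields it has never seen before rather than requiring the schema to be re-engineered. The event shapes below are invented for illustration:

```python
import json

# Raw events land as-is; nothing enforces a rigid schema at write time.
raw_events = [
    '{"source": "web", "user": "u1", "clicks": 3}',
    '{"source": "mobile", "user": "u2", "clicks": 1, "gps": [40.7, -74.0]}',  # new field
    '{"source": "iot", "device": "sensor-9", "temp_c": 21.5}',                # new source
]

for line in raw_events:
    event = json.loads(line)
    # Analysis reads only the fields it needs, with defaults for absent ones,
    # so a new source or a new field never breaks ingestion.
    clicks = event.get("clicks", 0)
    print(event.get("source"), clicks)
```

Contrast this with a rigid relational schema, where the `gps` field or the new IoT source would each force a migration before any data could be loaded.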
Source: http://www.sas.com/en_us/insights/analytics/big-data-analytics.html (from the video)
Digital transformation isn’t just about new technology; it’s about re-envisioning the organization as a whole. That means ensuring that the workforce is up to the task of making digital an integral part of their work life and using it to improve the business on a day-to-day basis. This doesn’t just include the IT department, but every single worker in the company. Business leaders must engage with their team and help them build the skill set necessary to succeed in the digital enterprise.
Identify deficiencies – Does the team have a product mindset and are they exceptional at collaboration, communication, problem solving, learning and troubleshooting? The most effective digital workers understand technology, how it can be used to improve the business, and are comfortable working and learning with it every day. Attempt to find areas in which team members may be lacking, and use that information to build a strategy going forward.
Train and hire – If there are areas in which the workforce is lacking, it may be necessary to make adjustments. This might include investing the necessary resources to equip staff with the skills required to harness the digital transformation. It may also require hiring new professionals to implement and execute the digital-transformation strategy.
2. Develop a digital-transformation strategy and roadmap
Undergoing digital transformation requires careful planning and deliberate execution. It is important for business leaders to be aware of both the potential opportunities and risks of implementing the strategy. This involves working with key stakeholders in the company, both on the IT and business sides, to develop a plan that meets the needs of the entire organization and end users.
An effective digital-transformation strategy:
Considers the business opportunities and aligns to the business strategic plan
Educates and communicates the changes required of all stakeholders
Pilots automation and cognitive solutions with clear success metrics
Aligns ITSM processes to account for multi-speed IT
3. Data management and governance
Data is the fabric that binds the components of digital DNA together, and good data governance is one of the core tenets of an effective digital enterprise. This should include the specification of decision rights and an accountability framework to encourage desirable behavior in the valuation, creation, storage, use, archiving, and deletion of data and information. It should also include the processes, roles, standards, and metrics that ensure the effective and efficient use of data and information in enabling an organization to achieve its goals. This provides a solid foundation on which to build a digital enterprise that uses data to make it more efficient, profitable, and competitive.
4. Relationship management
As third-party services rapidly become more important and zero-footprint IT becomes a reality, relationships become increasingly important to the digital enterprise. Relationship management spans IT, vendors, and customers; it forms the foundation of effective long-lasting partnerships. It’s critical that business leaders set clear expectations for those they work with and maintain complete transparency. That means holding vendors accountable to their SLAs and ensuring that they are actually meeting the real needs of the business and the end users.
5. Engagement model with the business
Digital DNA must be an integral part of the business and should be fully engaged with it. The digital enterprise must have technology embedded in the lines of business with a reporting structure both to the business head and the CIO. This ensures accountability and that the core focus is always on driving business goals.
Establish product owners – Each piece of the Digital DNA – including systems of insight, intelligence, engagement, protection, and record – must have a designated product owner. These product owners are accountable for both the outcomes and capabilities of their respective components. This is a very different way of thinking about traditional IT. Outcomes and capabilities foster a mindset of value – value to the business, customers, partners and shareholders.
6. Quality management
The digital enterprise is complicated and multifaceted. It is extremely important that the business leaders take steps to ensure every component is working properly and delivering the expected business results. This means conducting end-to-end testing as part of doing business. Each digital DNA component will be moving at a different velocity, and ensuring all the components work properly needs to become a core part of the company’s mission.
Disruptive trends are besetting the traditional IT organization across corporate America: Shift of IT decisions to business units, convergence of IT and business process outsourcing, cloud, social and mobile computing, and the consumerization of technology all conspire to demand a rethinking of the role of IT and of the CIO itself. There is urgency to act.
As the breadth of new technologies being developed and disruptive trends increase exponentially, it can be challenging for companies to understand these changes and how to adapt to them. At the heart of an effective digital enterprise are several key components that allow the company to leverage cutting-edge technologies and processes to drive business outcomes. Understanding this framework and how to successfully refine it to the needs of the company is key to achieving success in the digital era.
Systems of record
A company’s systems of record include core business transaction systems, including ERP systems (finance, HR, payroll, CRM, materials management, inventory, supply chain, and distribution) and record-keeping systems (financial services, healthcare, and all insurance verticals). These systems of record maintain and provide access to the key information businesses need for compliance, accounting, supply chains, and strategic planning.
The rise of information governance and MDM – As technology systems develop and IT processes adapt to be more business-focused, information governance and master data management (MDM) are becoming increasingly important. Paper records are rapidly becoming a thing of the distant past, and companies are looking for new ways to maintain and improve the accuracy and accessibility of their information. Data holds all of the digital DNA components together, yet most organizations don’t invest enough in master data management and data governance to allow the ecosystem of digital systems to interact seamlessly. This means taking new, holistic approaches that seek to address issues relating to compliance, organization, access, and retrieval. Implementing a comprehensive information-governance program supplemented by MDM can help ensure that the company’s systems of record are effective, accurate, and accessible.
How fast these systems can change – Because the data maintained on systems of record can be extremely sensitive and valuable, transitions to new technologies require care. Changes are often complex and time intensive. Business leaders should be cautious when implementing new systems of record to ensure that they will deliver the level of accuracy necessary. Release cycles for most companies will usually not exceed three per year.
Systems of engagement
Whether they be customers, employees, or partners, human beings are the driving force of every company. Systems of engagement are the interface between technology and humanity, connecting your team with business leaders, colleagues, and customers worldwide. Systems of engagement include mobile applications, SaaS tools, wearables and a wide range of other new technologies that have allowed businesses to engage more effectively.
The Cloud and mobility – Web and mobile applications facilitate interactions by allowing companies to quickly and easily reach out to customers and communicate internally. Recent statistics show that more than 10 apps are downloaded each year for every human being on the planet. Apps like Uber have already disrupted countless industries, while new technologies and innovations are likely to disrupt many more.
Similarly, SaaS applications have revolutionized sharing and collaborating internally and with partners worldwide. This makes it easier for companies to expand into new markets and source better talent, while still maintaining close, constant contact across offices.
How fast these systems can change – Systems of engagement can change much more rapidly than systems of record. Companies can easily add functionality because most of the systems rely on web or micro services, allowing companies to deliver new capabilities in as little as two or three weeks.
Systems of intelligence
At the cutting edge of digital technology are systems of intelligence. These systems include the automation, cognitive computing, smart sensors, and cloud solutions that allow companies to drive efficiencies, predictability, and accuracy across the enterprise. They represent some of the most exciting and potentially disruptive changes in the digital enterprise, but they are also the least developed.
Automation and cognitive computing – Automation is a human-productivity multiplier. It takes many of the time-consuming, repetitive, and error-prone tasks traditionally done by human workers, and allows them to be done faster and more accurately by machines. This includes simple customer service interactions, the basic assembly of manufactured goods, and the automatic repair of IT systems and services. This list will only continue to grow in coming years, as a greater number of tasks are able to be done by computers more accurately and with greater efficiency.
IoT – Systems of intelligence often extend into the realm of IoT (Internet of Things), collecting, analyzing, and acting on information and interactions that devices have in the real world. This can have significant implications for business, as companies can develop new business models and improve existing ones through more effective manufacturing, monitoring, and customer interactions.
How fast these systems can change – Although automation and cognitive computing are relatively nascent technologies, they can provide value to companies today. However, it is important for the organization to be mindful of how implementing automation technology will impact the company, its customers, and its employees. It is often necessary to gradually shift employees away from smaller tasks such as IT tickets, with automation tools acting as a supplement to the human worker, rather than a replacement. As these systems become more robust, it is likely that they will continue to allow companies to make greater improvements to their efficiency, customer service, and profitability, as well as provide an alternative for outsourcing.
Systems of insight
Systems of Insight are the tools companies need to better understand customers, optimize their operating model, and gain a competitive advantage. IoT smart sensors and applications collect data, while effective business intelligence and analytics allow companies to make better, more informed decisions.
Business intelligence and analytics – Having the right information at the right time is one of the most important elements of effective business. However, it is important to remember that it’s not enough simply to collect data. The key to powerful business intelligence is collecting the right data and extracting the most value from it. New technologies have made it easier to collect massive amounts of data, but data analysis has been relatively slow to catch up. With new innovations and better insight strategy, companies can more easily locate the most valuable insights and more effectively use them to drive business goals.
Data lakes – Data lakes provide organization-wide data management built to allow users to manipulate and analyze data across many formats and applications. This brings the power of Big Data into more areas of the company, potentially helping to improve efficiency and productivity across the enterprise. However, just because data lakes allow for easier access to data doesn’t mean that everyone within the company will have the motive or know-how to use them. This is an area in which it’s critical to ensure the required talent and training is in place before implementation.
IoT – Network connected devices also can play a significant role in a company’s insight infrastructure. By collecting real world data in areas like manufacturing lines, vehicles, and storefronts, companies can increase efficiency, reduce problems, and better meet customers’ needs.
Systems of protection
As a company undergoes a digital transformation, it becomes increasingly important that its digital assets be well-protected. Hiring key talent and investing in robust systems of protection is critical to avoiding breaches, downtime, or other damaging problems. Companies are being built on digital systems and cannot afford to have their very foundation exposed.
Proactive and defensive information security – Information security is one of the most critical elements of a risk-management strategy. Most companies are content to simply implement defensive information security systems. These might include firewalls, anti-virus software, and staff on hand to respond to breaches. However, for the digital enterprise, this is simply not enough. Companies should be proactive in their efforts to improve the security of their IT systems. This means hiring third parties to conduct full penetration testing, engaging with industry groups to identify and respond to threats, and building more robust systems of security.
Risk management – A company’s information security efforts should only be part of a broader risk-management strategy. This includes implementing systems of data backup, having offsite workplaces, and other solutions to mitigate the overall risk to their IT systems.
One of the most important considerations when adopting new cloud technology is cost. By taking steps to ensure that your organization is implementing the cloud in a way that provides a high ratio of benefits to costs, the organization can help make IT a revenue enabler that adds value to the organization.
Make a cost/benefit analysis – Whether your organization is implementing the cloud for the first time or evaluating current deployments, a good first step is making a cost/benefit analysis of the technology. How much will upfront and monthly costs be for the cloud deployment? How much will the company save in productivity, increased sales, or reduced downtime? What are the costs of alternative solutions? The answers should inform any cloud-implementation decision.
Repurpose existing investments – One of the most effective ways to reduce the cost of the cloud is to repurpose existing investments as entry points. By using an enterprise technology framework to identify what can be reused and what needs to be rebuilt, organizations can greatly decrease the financial costs of a cloud deployment. Investments made for server consolidation, ITSM, virtualization, API adoption and development, high availability improvements, and scripting automations are examples of improvements that can be applied to private-cloud deployments. They also can be a part of a hybrid deployment or integrated into a public deployment.
Negotiate agreeable terms – When working with public-cloud vendors or MSPs, it is extremely important to negotiate terms that meet the needs of the organization. Make sure that your organization has considered what maintenance is included, what the uptime guarantees are, how much support is available, and the vendor’s reliability record. They will all influence the ongoing cost of the deployment.
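The cost/benefit analysis suggested above can be reduced to a back-of-the-envelope model: cumulative savings minus upfront and recurring costs over the planning horizon. The figures below are illustrative assumptions, not benchmarks:

```python
def cloud_net_benefit(upfront_cost, monthly_cost, monthly_savings, months):
    """Hypothetical model: cumulative savings minus total costs over `months`."""
    total_cost = upfront_cost + monthly_cost * months
    total_savings = monthly_savings * months
    return total_savings - total_cost

# Illustrative figures only: $50k migration, $8k/month run cost,
# $15k/month in productivity and infrastructure savings, 3-year horizon.
net = cloud_net_benefit(upfront_cost=50_000, monthly_cost=8_000,
                        monthly_savings=15_000, months=36)
print(f"Net benefit over 3 years: ${net:,}")  # → $202,000
```

A real analysis would discount future cash flows and compare against the alternative solutions mentioned above, but even this crude model makes the break-even point explicit.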
Evolving your cloud implementation
To ensure that your organization stays competitive and takes full advantage of the growing power of cloud, it is important to constantly evolve your cloud implementation. Below are some key questions and steps to help your organization make its use of cloud more effective.
What business challenges is our organization facing?
How is IT affecting these challenges?
How effective are my cloud implementations (if any)?
How are our business stakeholders interfacing with cloud vendors without us?
What are the existing investments that can be repurposed as entry points to the cloud?
How will our internal cloud capability grow alongside our virtualized environments?
How will we start building relationships with external cloud providers?
As we begin to adopt more cloud solutions, how will it change our network architecture and data center requirements?
What cloud deployment models can be used to fill gaps, improve efficiency, and build the business?
Are the internal cloud and external environment more efficient and cost-effective?
How will we maintain talent?
How will we manage current providers and continue to build relationships with new ones?
Do we have systems in place to help us keep pace with changing technology?
How will the cloud impact how we interface with our suppliers and our customers and deliver our services?
What are the opportunities for reduced costs?
What are the opportunities for improved efficiency?
What new means of charging and paying for services does the cloud allow?
How can the cloud help our organization drive profits?
How will the cloud affect IT’s ongoing budget?
How will the cloud affect IT’s capital and operational expenditures?
New operating models
How do we manage security in the public and private cloud?
What are the greatest threats to a cloud deployment?
How do we ensure performance?
Do existing SLAs ensure uptime?
What staff do we need to maintain? What skills do they need?
How does the cloud affect enterprise architecture?
How does the cloud affect governance?
Are you looking for expert assistance in driving your cloud strategy to higher levels? WGroup’s cloud strategy consulting services could be exactly what you need. Learn more at http://thinkwgroup.com/services/cloud-strategy/.
The push to digital is a constant topic of discussion among today’s business leaders. But the road map to digital is often confusing and unclear. What is the role of the CIO in the digital enterprise? What about the IT team and the capabilities needed to infuse digital into the DNA of the organization? These and many other questions must be answered in order for an organization to become a modern digital enterprise. Understanding the foundational building blocks of cutting edge IT and how to use them to drive business goals is critical to maintaining a competitive advantage and truly becoming a digital business.
Achieving digital means enabling your innovation-minded people with the organizational structure, governance, and technologies that remove functional silos, so the organization operates as one, delivering unique and impactful customer experiences.
Finding the right balance of people, processes, and technology
Digital transformation isn’t just about buying new technologies and gadgets or hiring new talent. It’s a fundamental shift in the way a company operates on a day-to-day basis. Getting the right blend of people, processes, and technology to extract real value from a digital strategy is the most challenging aspect of the process. At the center of the push to digital are the people, mindset, and skills required to adapt to a digital world.
The new digital team needs traits that are different from what most of us are used to in today’s technology environment. People need exceptional skills in collaboration, communication, problem solving, learning, and troubleshooting to be successful in the modern workplace. The best digital people also have a product mindset and are inquisitive about new ways of doing things and new technologies. This allows them to rapidly adapt as operating models are disrupted and new strategies need to be implemented.
Digital transformations require fundamental process shifts as well. Simply implementing automation on a legacy system is not going to take advantage of the full power of the technology. Processes must be evaluated and re-engineered to fully harness automation and cognitive computing technologies as well as workforce skills. Developing and deploying a mobile application to complement an existing web application does not necessarily result in a more meaningful engagement with users or customers. It must be supported by robust processes and an effective team.
Technology is now an integral part of the business. The IT department shares accountability and responsibility for revenue, customer satisfaction, profit, and growth. Taking the ecosystem of applications and services in the enterprise, we can segment the elements into digital DNA components. Each of the digital DNA components relies on the other components, and each is connected to the others through a complex tapestry of data.
This series will help you decipher the digital enterprise and provide insights into making digital part of your organization’s DNA. Make sure to bookmark us at http://thinkwgroup.com/insights/
Many organizations are rightfully excited about the benefits of cloud, but often minimize the challenges that it can bring. In order to make good decisions about cloud deployments, companies must understand the risks.
Security – Security is one of the primary concerns when implementing a new cloud deployment, particularly in the public cloud. Entrusting critical applications and sensitive information to third parties can create great anxiety for organizations used to managing their own IT infrastructure. In a recent survey, the top perceived threats were unauthorized access (63%), account hijacking (61%), and malicious insiders (43%). Notably, users also fear that they cannot trust their cloud providers: 71% of respondents to one survey said they did not think their provider would alert them if customer data were stolen, and 72% believed they would not be notified if confidential business information were stolen. One major source of these fears is a lack of visibility. One recent survey found that approximately half of an enterprise’s cloud applications aren’t visible or fully accessible to the IT professionals on staff. This can create significant trust issues, causing many organizations to be wary of cloud solutions.
Although there are unique security risks to using a public cloud implementation, in reality it is often much safer to use the public cloud than on-premises IT. Tech leaders like Amazon and Microsoft have significant budgets, with experienced professionals working to ensure that their clients’ data and applications are safe. In all likelihood, the IT security branch of these providers is significantly better funded and more experienced than those of their clients. However, it is still important to understand the security risks at play and carefully vet cloud providers to ensure that they are a proper fit for your organization and applications.
Financial – Cloud technology can help turn IT from a major expense to a significant profit enabler. By reducing costs through outsourcing much of the IT department to third parties and reducing the need to invest in expensive infrastructure, organizations can improve their overall profitability. Other companies are leveraging the cloud by using data collection and analysis, automation, and other new technologies to reduce expenditure and increase revenues. It is critical to stay abreast of current trends in order to maximize revenue and productivity and minimize expenses.
Still, many organizations remain hesitant about new cloud technologies and are unsure what their real financial impact will be. IT professionals and executives need a solid framework upon which to base their cloud decisions in order to ensure that they are maximizing cost-effectiveness and getting the most from their cloud services.
Understanding the changing landscape – Cloud growth, automation, and other technological changes are shaping the way business is done, products are sold, and IT is managed. CIOs need to cope with this changing landscape by reallocating resources, finding personnel with the right skills, and reducing redundant staff and infrastructure. Understanding what actually needs to be done can be challenging, particularly if executives have limited experience working with new cloud technology. In order to make the right choices, organizations need to fully understand what these new innovations can actually do for the company, how they work, and how they should be managed. This requires a deep understanding of the technology and the marketplace.
Managing cloud contracts – One of the most challenging aspects of implementing the cloud in an organization is managing vendor contracts. Dealing with a range of cloud providers offering vastly different services and guarantees requires a comprehensive understanding of how the contracts are structured and what an organization really needs from the provider. Organizations must ensure that the provider complies with all local, applicable laws, that they have necessary control over any cryptographic keys used, that the provider has been recently audited, and many other similar details that might be overlooked by those who do not fully understand the current environment.
Getting the big picture – In order to manage all of these concerns, each organization needs a framework to understand where they are and where they’re going. With the shifting dynamics of the evolving IT world, businesses need comprehensive situational awareness. This allows them to understand their needs and how new technologies can help them stay competitive.
Evaluating your current state
In order to gain situational awareness and make better decisions when implementing the cloud into your organization, it is important to examine how others in your industry are being impacted by the cloud, your current state in terms of cloud consumption, and your needs. Organizations must ask themselves a number of questions.
What are the size, growth, and financial state of our organization?
How would our organization benefit from the cloud?
What cloud services has our organization already adopted?
What cloud components would our organization like to implement?
Why is our business using the cloud?
What cloud solutions are our competitors implementing?
How mature is our enterprise architecture and governance?
Where are there gaps between business expectations and what the cloud is delivering?
Do we have a cloud strategy to help stay competitive and reduce costs?
What are our performance needs?
Which applications can be safely moved to the cloud, and which must be kept on-premises?
Can the public cloud deliver the same or better performance as on-premises solutions?
Security and compliance
Does our company store sensitive information?
Which applications are mission-critical and need to exist in an extremely secure environment?
Can cloud vendors deliver the same or better security as the IT team?
What compliance regulations are relevant to our organization?
Do our current cloud vendors meet our cloud needs?
What are the contents of our cloud contracts?
Is our organization effectively managing our cloud vendors?
Making the move to cloud
Will the cloud be cost-effective?
Do we have the knowledge and staff to maximize the effectiveness of our cloud implementation?
What can benefit from being moved to the cloud?
What deployment model is right for our company? Will the flexibility and ease of public cloud be right, or do we need on-premises private cloud? Is a hybrid solution the best option?
What changes in governance, architecture and process are required to be effective?
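One way to start answering "Will the cloud be cost-effective?" is a simple multi-year total-cost-of-ownership comparison. The sketch below uses entirely hypothetical figures and a deliberately simplified model (upfront capex plus flat annual costs); a real analysis would also account for depreciation, staffing changes, and variable usage.

```python
# Minimal sketch of a multi-year TCO comparison.
# All dollar figures are illustrative assumptions, not benchmarks.

def tco_on_prem(years, hardware_capex, annual_ops):
    """On-premises TCO: upfront hardware capex plus yearly operating costs."""
    return hardware_capex + annual_ops * years

def tco_cloud(years, annual_subscription, migration_cost):
    """Cloud TCO: one-time migration cost plus yearly subscription fees."""
    return migration_cost + annual_subscription * years

years = 3
on_prem = tco_on_prem(years, hardware_capex=1_200_000, annual_ops=400_000)
cloud = tco_cloud(years, annual_subscription=550_000, migration_cost=300_000)

print(f"On-prem {years}-year TCO: ${on_prem:,}")  # $2,400,000
print(f"Cloud {years}-year TCO:   ${cloud:,}")    # $1,950,000
print("Cloud is cheaper" if cloud < on_prem else "On-prem is cheaper")
```

Even a rough model like this forces the right questions: which costs disappear with the cloud, which merely move, and over what horizon the comparison should run.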
When moving to a cloud environment, whether you manage your own cloud or use a cloud-service provider, your IT management of existing SLAs and associated systems should not change. If anything, you should expect more from a cloud-service provider than what you may have already implemented in-house.
For example, imagine you have brought discipline to what was once a loose demand-management process: a single annual meeting that usually resulted in priorities shifting to whichever squeaky wheel was loudest. You now run quarterly cross-functional meetings to review requirements, and you hold to the agreed priorities. In line with this, you are establishing a project management office (PMO), with the intent that it not be overly burdensome. But is the PMO standing on its own, executing individual projects transaction by transaction or on a squeaky-wheel basis? Is the portfolio aligned to the business strategy? Who in the organization is ensuring the "right" technology and architecture for what the business needs? A solid PMO is a valuable part of the puzzle and is needed for project (and larger program-level) governance and execution. But without a mature enterprise-architecture function in the organization, are they the right projects? Is anyone ensuring the organization is making the right technology choices to meet the business drivers and requirements, while mitigating technical risk and maintaining or improving stability?
As part of this scenario, imagine you have many large-scale infrastructure transformation initiatives planned and some are in flight, so having a PMO to govern and manage them is a good thing. You have a roadmap to transform your infrastructure with a "rolling thunder" approach that will take at least three years and cost about $30 million. Your CFO and board are already aware and support it (even though your board may not know what the core technology is, how it will solve the most pressing issues for the business, or by when).
The strategic steps you have in mind include assessing the current state, developing the future vision (leveraging cloud and new technologies), developing the roadmap to achieve the vision, and executing. Today you are in the process of assessing your infrastructure, as a starting point. In developing your future vision you have a number of items that are known considerations, in-flight initiatives, and challenges you are facing – your enterprise requirements. They include the following hypothetical activities:
June 2016 is a milestone month. Your Enterprise Microsoft contract is up for renewal.
You plan to upgrade to JD Edwards (JDE) EnterpriseOne (your options in this regard are limited).
As planned, your JDE EnterpriseOne ERP implementation is likely to start.
As you look at renewal options, you are considering migrating to Microsoft Office 365 to potentially reduce costs and administrative burden. This analysis should also begin to consider the integration of other Microsoft capabilities: SharePoint for collaboration, and Dynamics as a potential alternative to EnterpriseOne. And since you are growing your B2B model and business, you may be considering a CRM solution, which Dynamics also provides, versus what you may have in-house today.
Another potential integrated solution to investigate is Microsoft Azure. You may be able to further reduce application licensing and administrative burden, and reduce risks by going with a hosted Microsoft cloud offering through Azure.
Then, you start thinking. Today, roughly 50 enterprise applications run on-premises. Should they be moved to a co-location facility or the cloud? A benefit of moving away from on-prem is that it frees your resources to focus on core activities. But which applications and workloads should be moved, and which can be moved? What are the risks?
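Deciding which of those applications should move is ultimately a scoring exercise against criteria such as data sensitivity, legacy coupling, and scalability needs. The sketch below illustrates one simple weighted-scoring approach; the criteria, weights, and application profiles are all illustrative assumptions, not a prescribed methodology.

```python
# Minimal sketch of scoring applications for cloud suitability.
# Criteria, weights, and application names are illustrative assumptions.

CRITERIA_WEIGHTS = {
    "data_sensitivity": -3,  # higher sensitivity counts against migration
    "legacy_coupling": -2,   # tight coupling to legacy systems (e.g., AS/400)
    "scalability_need": 2,   # elastic demand favors the cloud
    "refresh_due": 1,        # hardware nearing end-of-life favors migration
}

def cloud_suitability(app):
    """Weighted score: positive suggests a cloud candidate, negative suggests on-prem."""
    return sum(CRITERIA_WEIGHTS[c] * app[c] for c in CRITERIA_WEIGHTS)

# Each criterion is rated 0 (not applicable) to 3 (strongly applicable).
apps = {
    "collaboration portal": {"data_sensitivity": 1, "legacy_coupling": 0,
                             "scalability_need": 3, "refresh_due": 2},
    "core system of record": {"data_sensitivity": 3, "legacy_coupling": 3,
                              "scalability_need": 1, "refresh_due": 0},
}

for name, attrs in sorted(apps.items(), key=lambda kv: -cloud_suitability(kv[1])):
    score = cloud_suitability(attrs)
    verdict = "cloud candidate" if score > 0 else "keep on-prem"
    print(f"{name}: score {score} -> {verdict}")
```

The value of a model like this is not the numbers themselves but the conversation it forces: every weight is a policy decision the business, not just IT, should own.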
Given the intended moves to Microsoft and the JDE ERP installation, you should be able to consolidate applications, which will reduce resource requirements.
You are a VMware shop, and your IT staff is very comfortable with this. There is no issue here or compelling reason to change.
Then as you get further into the analysis, you realize that moving applications into the cloud has network implications that need to be considered. Have you planned for this?
You also have apps that hold core systems of record in your datacenter on legacy systems (e.g., AS/400) that may not be the best candidates to re-platform (for interoperability reasons) or move off-premises (due to data privacy issues).
You’ve given a large portion of the responsibility for addressing and solving these challenges to your IT operations manager, who must also continue to maintain, run, and operate IT’s existing daily workload.
You have heard (rightly or wrongly) that the cloud can be your saving grace to solve these challenges.
Your expectations for taking advantage of the cloud for the future minimally include lower HW investment, lower staffing requirements, and more flexibility and scale.
To further complicate the above, you may now also have existing applications or systems that are undergoing or in need of overhaul because they are not yet ready to meet new regulatory mandates. Your data may be at risk of exposure or not properly protected.
The above is a typical example of what many organizations are facing today. The challenges and obstacles center on people, process, and technology. Introducing what can be perceived as disruptive technology can create additional obstructions for the business and your IT staff.
Before moving workloads to a vendor-hosted cloud, you need evidence that the vendor is already meeting regulatory standards (e.g., HIPAA, PCI-DSS, FedRAMP, FISMA) for organizations similar to yours.
As data proliferates, standards that deal specifically with the governance and management of data and information security continue to improve, including the identification of risks and the implementation of security controls to address them. The ISO/IEC 27000 series is the most widely recognized and applied set of standards relating to the security of ICT systems.
The core standards are 27001 and 27002, with 27001 containing the requirements related to an information security management system, and 27002 describing a series of controls that address specific aspects of the information-security management system.
ISO 27001 is a flexible standard that is meant to be interpreted and applied to organizations of all types and sizes, according to the particular information-security risks they face. In practice, this flexibility gives users a lot of latitude to adopt the detailed information-security controls that make sense to them, but it can make compliance testing more complex than some other formal certification schemes.
ISO 27002 is a collection of security controls (often referred to as best practices) that are used as a security standard. Assuming that the design and/or operation of a cloud service provider’s information security management systems are consistent with the standard (e.g., there are no notable gaps) it can be asserted that their environment is compliant with the standard.
The 27001 and 27002 standards apply generally to the operation of ICT systems. ISO 27017 and ISO 27018 are two new standards under development that describe the application of 27002 to cloud computing. ISO 27017 deals with the application of the ISO 27002 specification to the use of cloud services and to the provision of cloud services. ISO 27018 deals with the application of 27002 to the handling of personally identifiable information (PII) in cloud computing, sometimes described as dealing with privacy in cloud computing.
At a minimum, cloud-service customers are advised to look for providers that conform to the ISO 27002 standard for information systems security. This is not necessarily specific to cloud computing, but the principles can still be usefully applied to the provision of cloud services (i.e., as a measure of maturity and as a necessary safeguard of doing “the right things” in an IT organization). A cloud-service provider can assert on its own behalf as to its compliance with a standard, but having an independent, qualified third party certify compliance is a notably stronger form of attestation.
In addition, customers are advised to check whether their cloud-service provider conforms to the ISO 27017 and ISO 27018 standards, since they are specific to cloud computing, covering information security and the handling of PII, respectively.
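The due-diligence checks described above can be organized as a simple checklist. The sketch below is a minimal illustration of one way to flag gaps against required and recommended standards; the provider name and certification profile are hypothetical, and a real assessment would of course rest on audited evidence rather than a self-reported list.

```python
# Minimal sketch of a provider due-diligence checklist.
# The example provider and its certification profile are illustrative assumptions.

REQUIRED_STANDARDS = {"ISO 27001", "ISO 27002"}
RECOMMENDED_STANDARDS = {"ISO 27017", "ISO 27018"}  # cloud-specific security and PII handling

def assess_provider(certifications, third_party_audited):
    """Return a list of gaps against required and recommended standards."""
    issues = []
    missing_required = REQUIRED_STANDARDS - certifications
    missing_recommended = RECOMMENDED_STANDARDS - certifications
    if missing_required:
        issues.append(f"missing required: {sorted(missing_required)}")
    if missing_recommended:
        issues.append(f"missing recommended: {sorted(missing_recommended)}")
    if not third_party_audited:
        # Self-asserted compliance is a weaker form of attestation.
        issues.append("compliance is self-asserted, not independently certified")
    return issues or ["no gaps identified"]

# Hypothetical provider: certified to three of the four standards, no third-party audit.
for issue in assess_provider({"ISO 27001", "ISO 27002", "ISO 27018"},
                             third_party_audited=False):
    print(f"- {issue}")
```

A checklist like this does not replace an audit, but it makes the "recently audited, independently certified" criterion from the contract discussion above concrete and repeatable across vendors.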
WGroup is your preferred and chosen advisory partner to ensure that effective governance, risk, and compliance processes exist. If they don’t, we’ll show you how to implement and deploy them. However, this is just the first step. We are here to help you through the analysis of choices and architectural decisions you will need to make, with critical input from your team. We’ll help you adapt the leading and best practices implemented by those who have made this journey.
WGroup’s vision and capabilities align with the Cloud Standards Customer Council’s 10 steps to help your organization ensure success for secure cloud computing.
Ensure effective governance, risk, and compliance processes exist.
Audit operational and business processes.
Manage people, roles and identities.
Ensure proper protection of data and information.
Enforce privacy policies.
Assess the security provisions for cloud applications.
Ensure cloud networks and connections are secure.
Evaluate security controls on physical infrastructure and facilities.
Manage security terms in the cloud SLA.
Understand the security requirements of the exit process.
Taking a holistic approach to your challenges and in-flight initiatives, WGroup develops a strategy with you and your team. We meet your most pressing needs, but also align these to next steps in meeting your business strategy. In the most cost-effective and safest approach possible, we bring higher standards to your organization through service-provider capabilities and management.
Are you looking for expert assistance in driving your cloud strategy to higher levels? WGroup’s cloud strategy consulting services could be exactly what you need. Learn more at http://thinkwgroup.com/services/cloud-strategy/.
An ever-expanding array of cloud applications and services is available. SaaS, IaaS, PaaS, private cloud, hybrid cloud, and other solutions offer unique opportunities and challenges for businesses. Organizations need to understand this wide range of options and determine which choices fit their needs.
Multiplying XaaS options – Each organization has unique cloud needs, and public-cloud providers are offering a growing range of options to meet them. Although many non-IT focused organizations may use pre-packaged SaaS solutions, others are leveraging more flexible offerings to fully or partially outsource internal infrastructure. IaaS has seen accelerated growth in recent years, with worldwide spending having increased by more than 30% in 2015. PaaS also offers another option for organizations looking for an environment to develop and customize applications.
Although these have been the standard XaaS options for several years, many cloud providers are offering an increasing range of service options. Storage as a Service, Communications as a Service, Network as a Service, and Disaster Recovery as a Service are all now common options for businesses. Another new service that is expected to increase in popularity in the coming years is Big Data as a Service, with the total big data market projected to reach almost $90 billion by 2021. These offerings allow companies to leverage powerful servers to collect and analyze data more cost-efficiently and flexibly than would often be possible in-house.
Public, private and hybrid cloud – XaaS options exist in the realm of the public cloud, wherein many organizations share computing resources in a generic third-party-owned solution. This provides many benefits, such as economies of scale, flexibility, and reduced need for maintenance. However, some organizations find they need to create their own private cloud using proprietary servers. Many use a public deployment option in conjunction with in-house infrastructure in a hybrid cloud configuration. This allows them to take advantage of the flexibility and cost savings of the public cloud for certain applications, while keeping other applications on-premises for compliance or other reasons. This hybrid solution can offer many new opportunities for businesses that can’t fully outsource to the public cloud, allowing them to benefit from cost savings and get products to market faster.
Rising influence of MSPs – Another increasingly important option in the cloud arena is the managed service provider (MSP). These providers let organizations outsource their IT operations, providing security, maintenance, monitoring, and other services. Although these companies began by managing servers for organizations remotely, many have grown to offer their own or third-party cloud services to customers. They can provide fully managed hybrid implementations, often including mobile device management. This can offer an attractive solution for small- to medium-sized businesses (SMBs) without the human resources to manage their operations in-house.
This is part 3 of a 4-part series on advice for new CIOs
CIO-to-CIO advice: Overcoming challenges
As a new CIO, it is critical that you understand the culture of the IT organization and how to effectively influence change. In order to overcome difficulties and carry out long-term strategies, you must establish your own credibility and provide IT leadership that impacts real, positive change in the organization.
The best leaders don’t change the entire organization before they truly understand it.
Be patient. Take time to observe firsthand how the business operates.
Gain a point of reference by which to discuss problems and strategies in a relevant way and implement more effective solutions.
Outdated and misaligned IT strategies can cause inefficiency and unhappiness in the organization.
Look for deficiencies in security, vendor management, and business alignment.
Regularly assess the progress and success of projects to revise strategies and set a better course for the organization.
Improve service performance
Delivering secure, effective services is at the core of IT’s mission, but many CIOs face underperforming infrastructure and suppliers.
Review the service-delivery model and assess supplier constraints.
Initiate a routine process of developing service-improvement plans tied to personal objectives and SLAs.
Maintaining the day-to-day operations of the business while meeting increasing demands for new applications and services can be difficult.
Use what you have learned from talking to people within the company to better understand the business’s needs.
Create an executive governance committee composed of IT and business leadership to mutually prioritize these demands and ensure that efforts are being best allocated to meet the needs created by shifts in business and peak periods.
Fill talent deficiencies
In your first months as CIO, you will find areas where your organization lacks the necessary skills or leadership to maximize productivity, creativity, and efficiency.
Establish key consultancy partnerships for guidance and to help fill temporary gaps in the resource plan while seeking long-term talent.