Many IT organizations rely on the Information Technology Infrastructure Library (ITIL) framework for service management and operations. Over the years, ITIL has proven to be a useful set of practices for IT Service Management (ITSM) and for aligning IT investments and operations with business goals. Among its benefits, advocates and practitioners of ITIL point to increased reliability, uptime, and predictable costs.
ITIL fundamentally concerns itself with IT services; that is, the functions and processes that the IT
organization provides to the business. A service is something—an application, a set of applications,
information, people—that a business user consumes in order to perform a business function. In
general, the cloud as a technology does not change the goals of ITIL; however, the cloud can
dramatically change how services are delivered, as we have shown.
ITIL consists of five key strategic areas:
IT Service Strategy ITIL’s Service Strategy provides a set of frameworks for determining what
services are delivered, how their value is measured, how to measure cost and provide a measure
of return on investment (ROI), and how to manage the IT relationship with its business partners.
Earlier in this chapter, we described how to set up a strategy effort that defines the overall goals—
technical, financial, and organizational—of the cloud migration effort.
IT Service Design IT Service Design covers the design of processes and how they relate to one another, service-level agreements (SLAs), capacity and availability management, business continuity management, security, and supplier management. We discussed these topics as well earlier in this chapter; patterns for backup and business continuity are provided in Appendix B.
IT Service Design also notes the need for a service catalog, of which the portfolio management
and configuration management systems described earlier are key parts.
IT Service Transition Service Transition governs how services are delivered and deployed. Such
areas as change management, release and deployment management, and service evaluation are
typically part of the transition phase. The goal, of course, is that new services and changes to
existing services are deployed with minimum impact to the overall IT ecosystem.
Whereas the structure of service transition remains the same, the actual tasks when deploying a
service to the cloud change significantly as we have described. In particular, the emergence of
DevOps and its associated methodologies means that the processes and tools associated with
deployment are new and different. In addition, IT departments might want to think about such
areas as SLA measurement differently, considering that there might be additional latency to the
cloud, for example.
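One way to rethink SLA measurement in light of added cloud latency is to track response-time percentiles rather than averages. The sketch below is illustrative only (the sample values and the 30 ms latency delta are invented); it shows how an unchanged SLA target can start failing once network latency to the cloud is added:

```python
# Illustrative sketch: evaluate an SLA by checking the 95th percentile of
# observed response times, not the mean. All figures are invented examples.
from statistics import quantiles

def p95(samples_ms):
    """Return the 95th-percentile latency (ms) of a list of samples."""
    # quantiles with n=20 yields cut points at 5% steps; index 18 is the 95th.
    return quantiles(samples_ms, n=20)[18]

def meets_sla(samples_ms, target_ms):
    """True if the 95th-percentile latency is within the SLA target."""
    return p95(samples_ms) <= target_ms

# Example: on-premises samples vs. the same service measured via the cloud.
on_prem = [40, 42, 45, 41, 44, 43, 47, 46, 44, 42,
           45, 43, 41, 48, 44, 43, 42, 46, 45, 44]
cloud = [s + 30 for s in on_prem]  # assume ~30 ms of added network latency

print(meets_sla(on_prem, target_ms=60))  # True
print(meets_sla(cloud, target_ms=60))    # False: the old target no longer fits
```

The design choice here is deliberate: averages hide tail latency, and it is usually the tail that breaches an SLA after a move to the cloud.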
Similarly, IT departments should set up a test cloud environment mirroring the production
environment in order to allow user acceptance testing (UAT), load and penetration testing, and
integration testing with other applications prior to full production deployment.
IT Service Operation Service Operation covers the management and monitoring of services, and how issues are managed and resolved. Key to Service Operation is the notion of a Service Desk, the primary point of contact for service incidents and events. The service desk, as well as the call center and help desk if separate, will need to be trained to support cloud-based services.
IT Continual Service Improvement In Continual Service Improvement (CSI), IT personnel and
business teams work together to ensure services can quickly meet new and emerging business
requirements. CSI is heavily data driven and relies upon operational statistics as well as business
insights to determine where focus should be placed.
In general, cloud migration will force organizations to change some of the mechanisms and processes by which they implement ITIL, although the basic structure of ITIL is generally technology-independent. However, organizations should also consider how to extend their own processes to be more agile than ITIL might suggest; given that experimentation and prototyping (as we have discussed earlier) are quick, think about how to do them as part of the strategy and design phases.
Source of Information : Microsoft Enterprise Cloud Strategy
Tuesday, February 28, 2017
Most IT organizations have a program management office (PMO) of one form or another. The PMO’s
function is to ensure that changes entail minimal risk and disruption to the IT function. In moving to
the cloud, the PMO will need to manage new change management functions. These include the following:
- Operational readiness, to ensure the operations or DevOps function is ready to manage a cloud-resident application
- User readiness, in the case of functional changes to applications
- Organizational readiness, to ensure (for example) that dependent applications continue to function and that all security, compliance, and financial requirements are completed
- Application and ecosystem readiness, to ensure that applications moving to the cloud, and applications that remain on-premises but integrate with the cloud applications, are fully tested and ready, and that all issues are known in advance
There are other aspects of governance (for example, supplier management). However, it should now
be clear that governance in the cloud, by and large, extends existing functions, and professionals in
each of these areas should consider the impact of cloud applications on their space.
Migrating applications to the cloud is an important and significant activity, requiring changes to how
both businesses and IT operate. In this chapter, we have described how to form and use a Cloud
Strategy Team to drive the migration; how to involve the many organizational stakeholders; how to
prioritize application migration; and how to extend existing governance activities. Of equal importance are the transformational aspects of the cloud, which should be examined and performed concurrently with migration. In Chapter 5, we outline what we mean by transformative innovation and the opportunities afforded by the cloud.
Monday, February 27, 2017
We have already described the fairly significant changes to IT finance implied by the cloud: the change from a capital expense model to an operational expense, or subscription model. Financial
governance ensures that the financial changes are managed in a methodical and predictable fashion, and that:
- Actual cloud costs are in line with predicted cloud costs
- Capital expenses are declining in line with expectations
- Cloud billing is consolidated and no "rogue" credit card accounts are allowed
- Appropriate chargeback mechanisms are created or extended to support cloud computing
- Quarterly or annual budgeting shows the appropriate changes
- Reporting systems accurately reflect current spend on IT
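The first two checks above can be automated with a simple variance report. The sketch below is a minimal illustration; all line items, figures, and the 10 percent tolerance are invented for the example:

```python
# Illustrative sketch: flag budget lines whose actual spend deviates from
# forecast by more than a tolerance. Figures are invented for the example.
def variance_report(forecast, actual, tolerance=0.10):
    """Return {line_item: relative_variance} for items outside tolerance."""
    flagged = {}
    for item, predicted in forecast.items():
        spent = actual.get(item, 0.0)
        variance = (spent - predicted) / predicted
        if abs(variance) > tolerance:
            flagged[item] = round(variance, 3)
    return flagged

forecast = {"cloud_compute": 50_000, "cloud_storage": 12_000, "capex_servers": 80_000}
actual   = {"cloud_compute": 61_000, "cloud_storage": 12_500, "capex_servers": 79_000}

print(variance_report(forecast, actual))
# {'cloud_compute': 0.22} — compute spend is 22% over forecast, flagged for review
```

A report like this, run monthly against the provider's consolidated bill, also helps surface the "rogue" accounts mentioned above, since their spend appears nowhere in the forecast.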
Sunday, February 26, 2017
Data governance in IT has long been a critical function, and its essential nature existed long before the cloud. Creating and ensuring adherence to common data models, providing extensibility where
needed, managing changes, ensuring regular and controlled taxonomy updates, use of master and
reference data, data classification, formal processes around data retention and destruction: all of these
activities have been part of the IT governance function for decades.
The cloud, however, adds some new dimensions to data governance. First, many countries have, or are developing, laws governing where information about their citizens can reside. This is called data
sovereignty, and the concerns are, in a nutshell, that if your data leaves your country, it will be easier
for foreign government agencies to obtain it. To be clear, as of this writing the legal elements of data sovereignty are still evolving, but nevertheless it is important to design an appropriate governance strategy.
To address these concerns, know what data your applications are keeping in the cloud, and know as
well what the laws of your country are with regard to data sovereignty. Some potential actions you
might want to take include the following:
- Not placing any individual or customer data in the cloud
- Encrypting key PII such as email addresses or physical addresses prior to moving data to the cloud
- Disabling geo-replication to other geographies
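As one illustration of the second action, the sketch below pseudonymizes email addresses with a keyed hash (HMAC) before records leave the premises. Note the hedge: a keyed hash is a stand-in here for proper reversible encryption, and the key handling shown is deliberately simplified; a real system would use a managed key and a vetted cryptographic library.

```python
# Illustrative sketch: pseudonymize PII (here, email addresses) with a keyed
# hash before moving records to the cloud. A keyed hash is not reversible
# encryption; real systems would use a managed key and a vetted crypto library.
import hashlib
import hmac

SECRET_KEY = b"on-premises-secret"  # placeholder; keep real keys out of code

def pseudonymize(value: str) -> str:
    """Return a stable, non-reversible token for a PII value."""
    return hmac.new(SECRET_KEY, value.encode(), hashlib.sha256).hexdigest()

def scrub_record(record: dict, pii_fields=("email", "address")) -> dict:
    """Replace PII fields with tokens, leaving other fields intact."""
    return {k: pseudonymize(v) if k in pii_fields else v
            for k, v in record.items()}

record = {"id": 42, "email": "user@example.com", "region": "EU"}
scrubbed = scrub_record(record)
print(scrubbed["region"])                     # non-PII fields pass through
print(scrubbed["email"] != record["email"])   # True: email replaced by a token
```

Because the token is stable for a given value and key, records can still be joined on the scrubbed field in the cloud without exposing the underlying address.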
Much has been written on this topic and (as we have mentioned) the laws continue to evolve, so it is a
best practice to stay on top of the emerging legislation and case studies.
Additionally, now that applications are in the cloud, it is far easier to access Internet-resident data
sources. Many governments now place very large quantities of data in the cloud (in the United States,
for example, at http://www.data.gov; in the United Kingdom, http://www.data.gov.uk;
http://data.gouv.fr in France; and so on). Other companies make data available over the web for a fee,
and such data, for example, can augment or even replace on-premises master data sources or can
provide additional marketing insights. Use of such data, including analyses of their semantics to
ensure alignment with enterprise data sources, should be governed by the data governance function.
Saturday, February 25, 2017
When Microsoft IT began its cloud migration journey in 2009, it followed a similar process. First, it
cataloged its operating system instances and application workloads. This assessment included both
quantitative data that was mostly retrievable by tools as well as qualitative data that was partially
retrievable by tools and also required examination by both the operations team and the business
liaison team. This latter category of metadata included relationships, dependencies, and integration points.
Microsoft IT then identified the initial prioritized eligible operating system instances and initial
prioritized eligible workload/applications. Next, these initial migration candidates were reduced by
removing any business-critical systems, which would be moved after more experience had been
gained. Then, this initial list of candidates was prioritized and sequenced with less-complex
applications placed before more-complex applications, and applications running on updated VMs prioritized over those running on physical machines or legacy VMs. Some applications were identified as ineligible for various reasons (most of those limitations no longer existed by 2015), and these were migrated to an optimized on-premises datacenter.
After the initial set of migrations was completed, Microsoft IT completed work to make less-eligible operating system instances (OSIs) and workloads more eligible. For example, OSIs with older
operating systems or database versions were updated, more applications on physical machines were
moved to VMs, and more mission-critical applications were deemed eligible. Applications and
workloads that were identified as requiring a major overhaul were rebuilt as services on Azure.
Here’s the process:
1. Identify eligible hardware (OSIs) per Azure compute, storage, and RAM limits.
2. Identify eligible applications, remove HBI apps, sequence critical and complex apps for later, and
right-size to include more apps.
3. Increase eligible hardware and applications by doing the following:
   - Virtualizing more servers
   - Expanding to more regions
   - Including extranet-facing apps
   - Including HBI apps
   - Getting current (OS, SQL)
   - Increasing Azure VM limits
4. Build new applications as services for SaaS IT.
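Step 1 of this process amounts to a filter over the OSI inventory. The sketch below illustrates the idea; the limits and inventory entries are placeholders, not the actual Azure VM limits of any given year:

```python
# Illustrative sketch of step 1: filter an OSI inventory against compute,
# RAM, and storage ceilings. The limits below are placeholders, not real
# Azure service limits.
LIMITS = {"cores": 16, "ram_gb": 112, "disk_gb": 1000}

def eligible(osi):
    """True if an operating system instance fits within the stated limits."""
    return (osi["cores"] <= LIMITS["cores"]
            and osi["ram_gb"] <= LIMITS["ram_gb"]
            and osi["disk_gb"] <= LIMITS["disk_gb"])

inventory = [
    {"name": "web-01", "cores": 4,  "ram_gb": 16,  "disk_gb": 200},
    {"name": "dw-01",  "cores": 32, "ram_gb": 256, "disk_gb": 4000},  # too large
    {"name": "app-07", "cores": 8,  "ram_gb": 64,  "disk_gb": 500},
]

candidates = [o["name"] for o in inventory if eligible(o)]
print(candidates)  # the oversized data warehouse is deferred
```

Step 3 of the process then amounts to raising these limits or reshaping the ineligible instances until they pass the same filter.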
Friday, February 24, 2017
Your mapping exercise leading from the current state to the desired state is the root of your cloud
migration plan. The migration plan takes the map and adds specifics such as priorities and sequencing.
You should set priorities within your plan based upon a combination of business factors,
hardware/software factors, and other technical factors. Your business liaison team should work with
the operations team and the business units involved to help establish a priority listing that is widely accepted.
For sequencing the migration of your workloads, you should begin with less-complex projects and
gradually increase the complexity after the less-complex projects have been migrated. As with running a pilot project, you will gain valuable experience while migrating applications with lower complexity and lower business risk, which can help prepare you for the more complex and more business-critical migrations.
Your cloud migration plan will be more of a process than a static plan document. In its essentials, your plan will actually be a compilation of a number of smaller plans that deal with the migration of each departmental workload, based upon the sequence you establish. The particulars of each migration will generally follow this pattern:
1. Analysis This is the process illustrated earlier in the discussion of mapping the current state to
the desired state. This process will help you to identify the gaps between what you currently have
and what it will take to migrate that workload to the cloud. Those gaps might involve changes to
the architecture of the workload or might require a complete rewrite of the program. Additionally, many legacy programs will require significant work to make them more performant and scalable, and you should identify this work during your analysis of the workload.
2. Application migration When you determine that a particular workload should be moved to the
cloud, it is a best practice to create a version of the workload with a minimal amount of data in
order to get the application working on the cloud or to build a new version of the application there.
If the application is already running on a VM, it might be possible to simply migrate the VM to the
cloud without further changes. In general, many on-premises applications can run on Microsoft
Azure with minimal or no changes, but this does not mean that the application will be optimized
for performance, scalability, and security. So, you might need to redesign and rebuild the
application, to some degree, by using modern service-oriented principles.
3. Data migration This is somewhat similar to the application migration in that the data structure
can be moved as-is to either a relational (Azure SQL Database, SQL Server in Azure VM) or
nonrelational (blob, table, queue, Azure DocumentDB, and so on) location on the cloud. Several of
these kinds of migrations are extremely easy, and you can conduct them with the help of a wizard
such as the SQL Server Azure Migration Wizard. However, you might want to consider rebuilding
the data model as a new Azure SQL Database to gain performance, scalability, resiliency, and
security improvements. If you need to synchronize data between on-premises and SQL Database
or between different SQL Database servers, set up and configure the SQL Data Sync service. In
addition, it is a best practice that you set up and configure a data recovery plan in case of user
errors or natural disasters.
4. Optimization and testing After you migrate your application and data to Azure, perform
functional and performance tests. At this phase, test your application in the cloud and confirm
that it works as expected. Then, compare performance results between on-premises and Azure.
After that, resolve any feature, functionality, performance, or scalability issues in your cloud application.
5. Operation and management After the testing and optimization phase, set up and implement
application monitoring and tracing with Azure Application Insights, which enables you to
collect and analyze telemetry from your application. You can use this data for debugging and
troubleshooting, measuring performance, monitoring resource usage, traffic analysis and capacity
planning, and auditing.
You can use the Microsoft Operations Management Suite (OMS) to manage applications running
both on premises and off premises. OMS provides a single view of all your applications, regardless
of where they are hosted.
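Because the plan is "a compilation of a number of smaller plans," it can help to track each workload through the five phases explicitly. A minimal sketch (phase names taken from the list above; the workload name is invented):

```python
# Illustrative sketch: track each workload through the five migration phases
# described above, enforcing that phases complete in order.
PHASES = ["analysis", "application migration", "data migration",
          "optimization and testing", "operation and management"]

class WorkloadMigration:
    def __init__(self, name):
        self.name = name
        self.completed = []

    def complete_phase(self, phase):
        """Phases must be completed in order; raise if a step is skipped."""
        expected = PHASES[len(self.completed)]
        if phase != expected:
            raise ValueError(f"{self.name}: expected '{expected}', got '{phase}'")
        self.completed.append(phase)

    @property
    def done(self):
        return len(self.completed) == len(PHASES)

payroll = WorkloadMigration("payroll")
payroll.complete_phase("analysis")
payroll.complete_phase("application migration")
print(payroll.done)  # False: three phases remain
```

Keeping one such record per workload makes the overall migration status easy to summarize for stakeholders, in the sequence you established earlier.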
These five phases of migration will be conducted for each workload that you want to migrate.
However, there is also an iterative process that is greater than any one migration, by which you can
begin moving applications that meet your initial minimum standards, based on priority and sequence.
When the initial group is migrated, you can then begin to work on making more applications and
hardware eligible by upgrading operating system/SQL versions, getting current with all security patches, moving applications on physical machines to VMs, addressing issues caused by multiple IP
addresses, and so on.
Thursday, February 23, 2017
Every major change in the way you conduct business entails some amount of risk; few aspects of the
cloud have generated more discussion and controversy than those regarding its security and risk. In
this time of breaches, nation-state hacking, and growing and profound concern with individual privacy on the Internet, cybersecurity has become a board-level concern, and rightly so.
Begin by understanding the security postures of the cloud platform providers. Issues to examine include the availability of antimalware software for cloud-hosted applications; the presence of intrusion detection software and tools; sophisticated and secure identity management; at-rest and in-motion encryption options; networking options for on-premises and off-premises communications; the ability to do penetration testing; and so on. The requirement to implement "defense in depth" remains; you will need to determine how you can collaborate with your cloud provider to implement
and enhance it.
You should also understand the physical security practices of the cloud provider. Are employee background checks required? Does access to the cloud datacenter require biometric authentication?
Next, because the cloud potentially makes it possible to access corporate computing devices from anywhere in the world, the information security team should address what requirements should be levied on these devices to grant them such access. For example, it might require all client devices to
have encrypted local storage by using such technologies as Microsoft BitLocker. Similarly, because
typing usernames and passwords on mobile devices can be tedious, the team should consider the
merits of alternate forms of authentication, such as biometrics. Or, it might choose to implement
“multifactor authentication,” requiring both a username/password as well as some other form of
identity (such as a smart card).
A related capability in the cloud is its ability to accept authentication credentials from a multitude of sources by using the Open Authorization (OAuth) protocol. Information security professionals should
decide which, if any, applications may accept (for example) Facebook or Google credentials. E-commerce sites might benefit from usage of these credentials, but internal applications likely would not.
Third, verify key regulatory compliance certifications (for example, HIPAA, the Health Insurance
Portability and Accountability Act; FISMA, the Federal Information Security Management Act; and the EUDPD, European Union Data Protection Directive). Different industries and different geographies will be governed by different regulations and standards. Learn how to detect a suspected breach and how to report it to the provider, and what the response time SLA is expected to be. The Microsoft Azure Trust Center provides details on all of these as they relate to its offering. The Cloud Security Alliance is an excellent independent resource bringing together experts from across the industry to develop recommendations for best practices for secure computing in the cloud.8
Even though the cloud provides many security advantages, hosting an application in the cloud does
not entirely relieve application writers and security professionals of their responsibilities. We strongly recommend developers and testers adhere to the Security Development Lifecycle (https://www.microsoft.com/en-us/sdl/default.aspx), which provides a set of steps for anticipating
and mitigating threats. Antivirus and antimalware options should be included in your deployments. Penetration testing of deployed applications should be performed.
Wednesday, February 22, 2017
The enterprise architecture (EA) team plays a key leadership role in cloud migration. The goal of any EA team is to ensure that the highest business value is received for the most efficient use of technology resources; as such, EA provides the essential bridge between business and IT.
Typically, EA maintains the list of IT capabilities and processes, facilitates the creation and implementation of IT strategies, works with businesses and executives to understand the long-term
goals of the company in order to plan for the future, and drives various enterprise-wide governance
activities such as architecture review. For such reasons, the EA team is an ideal choice to lead the
Cloud Strategy Team.
The EA team overseeing the IT ecosystem as a whole is in a position to provide the appropriate
analyses of system capabilities and application impacts of any large-scale changes to the ecosystem.
Often, it is EA that creates and maintains the portfolio management system (the catalog of applications) from which the prioritization of applications to be moved to the cloud can be drawn (we
will have much more to say about this process later). Enterprise architects should examine what is
known about the portfolio and where additional information is needed—for example, whether an
application is virtualized. The EA team should add this and other attributes to the knowledge base and engage with other parts of IT to collect the data. Other examples of such metadata will be described shortly.
Cloud migration offers the enterprise architect many opportunities. By using modeling techniques
such as business capability analysis7 and capability maturity models, it might be possible, as the
prioritization process for applications takes place, to simplify IT by consolidating applications of similar function. Consolidation will have clear financial benefits both by reducing the compute, data, and network requirements, as well as by simplifying the operations and maintenance functions. The enterprise architect, and in particular the enterprise information architect, can also use the opportunity afforded by cloud migration to analyze the data models used by applications and update them to enterprise-wide canonical models. Such an effort will streamline application integration and reduce semantic mismatches between disparate data models, which often require manual adjustment in a complex on-premises environment.
In addition, it is the EA team’s core responsibility to create and maintain as-is and to-be roadmaps of the overall IT ecosystem. The EA team should be able to easily communicate the various stages of the migration, summarizing the current thinking of the Cloud Strategy Team.
Finally, the EA team should direct the investigation into the use of new cloud technologies to augment existing capabilities or provide entirely new functionality to IT applications, and, as these are validated, to add them to the existing roadmaps. Enterprise architects need to experiment with new technologies as well as understand and communicate their business value to IT management and business stakeholders. Successful investigations should lead to the development and publishing of reference architectures that applications teams can reuse.
Tuesday, February 21, 2017
IT departments often live in a world of contradiction. On the one hand, they must “keep the lights on,” by keeping servers and networks up, by delivering reports on time, and by ensuring that systems and data meet regulatory obligations such as Sarbanes-Oxley and other forms of compliance. These
requirements are nothing if not rigorous—and essential.
On the other hand, they and their business partners desire innovation: new programs and new applications to support both new and evolving business opportunities, to better serve their customers,
and so forth. Yet the costs of IT operations—sometimes 70–80 percent of the overall budget—reduce the ability of IT to spend on new programs and innovation.
In many cases (in fact, in every enterprise we know), there are occasionally applications created and
deployed outside of the IT department in response to critical business needs. These unofficial applications are often referred to as “shadow” IT. Instead of going through the usual budget,
requirements analysis, design and deployment phases typical in the creation of a new IT application, a marketing department publicizing a new campaign might simply create a new website on their own.
Because it eliminates the capital-expense investment component (i.e., servers, storage, and network)
of application development, the cloud makes this sort of rapid innovation much, much easier. In effect, all that is needed are a few coders to write the application—and a credit card.5
IT executives should realize that this sort of innovation and experimentation is inevitable, and in many cases actually desirable. As the business climate rapidly evolves, it is critical for both businesses and IT organizations to foster rapid experimentation and innovation. It will be important to educate businesses on the importance and consequences of regulatory issues and noncompliance, of course. IT departments can actually help them by providing controlled, managed access to critical data, such as customer information, rather than letting them gather and manipulate the data on their own.
As soon as a company starts this process of envisioning and creates the culture of experimentation, it
learns a disruptive truth: in the cloud era, you must experiment, fail fast, and learn fast. It is important to experiment in order to learn quickly both from successes and from failures. Learning
from how you succeed and what makes you fail provides the basis for delivering the disruptive
innovation and value from the cloud.
As you can by now expect, these phases shape the cloud migration principles used for the rest of the process. These principles are: go fast, push the boundaries, make data-driven decisions, simplify, and, finally, communicate to succeed.
Go fast exemplifies the spirit of the experimentation phase. For some, it might represent a new
way of thinking for IT because, with the cloud, you can “spin up” new projects quickly with a few
clicks rather than having to plan, allot datacenter space, procure equipment, and so on. We
call this the try many, use best approach because the cloud uniquely facilitates the ability of IT
departments to choose the best of many solutions.
Push the boundaries suggests that wherever possible, IT should not simply adapt to the new paradigm of the cloud, but embrace it and adopt new architectures and processes quickly to best
exploit the new opportunities.
Make data-driven decisions proposes that you carefully track and measure the numbers, including the cost effectiveness of the cloud for financial reasons, system telemetry for technical efficiency
reasons, and so on. Following the data carefully will make it possible for you to make informed
decisions about which applications are generating the most return, about which you should prioritize, about which are performing well in the cloud, and where potential problem areas exist.
Simplify focuses on retiring, right-sizing, and consolidating as many services and applications as
possible. Applications that are infrequently or rarely used often generate significant costs for an IT
organization, with little return. Retiring them and consolidating them with applications that
perform similar functions can, conversely, generate savings in a number of areas such as hardware, system software licenses, and maintenance. Consider generating metrics around “hot” and “cold” applications based on CPU, network, and database utilization; for example, an application that averages two percent of CPU and has few authenticated users might be just such a “cold” application.
Communicate to succeed is the single most important mechanism for guaranteeing continued success, not just the migration of a single application or service. Establish a clear and continuous communication channel for stakeholders to visualize successes and impact as well as to understand failures and the lessons learned from them. Key stakeholders remain engaged and continue to invest when they feel they are participating in a joint effort, making this a continuous journey and not just a single trip.
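The "hot" and "cold" metric suggested under Simplify can be sketched as a simple classifier. The thresholds and application data below are arbitrary illustrations, not recommended values:

```python
# Illustrative sketch of the Simplify principle: classify applications as
# "hot" or "cold" from utilization metrics. Thresholds are arbitrary examples.
def classify(app, cpu_threshold=5.0, min_users=10):
    """Label an application 'cold' when both CPU use and user count are low."""
    if app["avg_cpu_pct"] < cpu_threshold and app["monthly_users"] < min_users:
        return "cold"
    return "hot"

apps = [
    {"name": "legacy-reports", "avg_cpu_pct": 2.0,  "monthly_users": 3},
    {"name": "order-entry",    "avg_cpu_pct": 38.0, "monthly_users": 4200},
]

for app in apps:
    print(app["name"], classify(app))
# legacy-reports averages 2% CPU with few users: a retirement candidate
```

Requiring both signals to be low avoids misclassifying applications that are lightly loaded but business-critical to a small group of users.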
Monday, February 20, 2017
Microsoft IT developed its first cloud application in 2010. It was an employee auction application, used once a year as part of the Microsoft charitable giving campaign. With it, employees donate items
(ranging from mentoring sessions, to cooking classes, to software, and even the use of an executive’s
car for a day!) and others buy them, with all the proceeds going to charity. The auction, typically held
in October, runs for a month.
Why did we pick this as our first cloud application? A number of factors led us to this decision: first, it was not a business-critical application. Therefore, news of any application problems would not cause damage to the company’s finances or reputation or appear on the front page of any newspaper.
Second, we could see the scalability features of Microsoft Azure in action. As the end of October approached, traffic on the application continually rose, reaching a peak in the last few days of the auction.
Finally, it was a relatively simple application whose deployment in the cloud did not require updating other applications in concert.
In the end, the application was very successful and the auction met its goals (incidentally, over the
years, Microsoft’s employees have raised more than one billion dollars for charity). Microsoft IT
learned many lessons on cloud development and deployment, which we used in subsequent stages of
our own journey. We saw the application easily scale to meet the increased demand during the course
of the month. At the end of the auction, we could shut it down and no longer pay for resources
required to run it (as we would have—for servers, operations staff, and so on—had we run the
application in our own datacenter). By every measure, then, this first experiment was a success.
There were many other early experiments in this period, trying out new approaches, testing new
features, and so on; we learned that developing a “culture of experimentation” was useful in that we
could be continuously trying new things and innovating.
Sunday, February 19, 2017
To focus our efforts on guidance for existing applications, let’s proceed with the most convenient way
to think about modernization, which is commonly called “the five R’s”3: retire, replace, retain and wrap, rehost, and reenvision. It’s likely that no single approach will be appropriate for all of an enterprise’s legacy applications, and a mix of differing approaches might be warranted, based on the value that an application delivers versus the cost of any given approach. Because these approaches depend highly on the situation, application, and types of cost involved, there is no one-size-fits-all solution.
Retire Of course, if a legacy application is providing little value compared to its costs, the enterprise should consider it a candidate for retirement. When few people are using an application relative to its cost impact, the enterprise needs to run a cost-benefit analysis to determine if it is worth the expense. Additionally, some functionality provided by legacy systems may be rolled into a consolidated modern application running in the cloud, allowing some applications to be retired while others are replaced and modernized.
Replace Often, a legacy application is providing some value, but an off-the-shelf replacement
with a lower total cost of ownership (TCO) is available. Many legacy applications were originally
built because there was no alternative at that time. A modern, readily available application that is
better suited to running in the cloud—most cost-effectively of all, a SaaS application—may now
exist that can be used to replace the older one. Also, when a legacy application is replaced with a
more comprehensive modern solution, there might be a chance to consolidate functionality from
several older applications, thereby replacing multiple applications with a single system.
Retain and wrap If a legacy application is providing good value and not incurring a high TCO,
the best approach might be to retain it but put a modern “wrapper” around it in order to gain additional value and benefits. Examples of the “retain and wrap” approach include the following:
Wrap a legacy application with C# code in Microsoft Visual Studio, expose its functionality as web services, and then add a layer of orchestration around those web services.
Extend a legacy application with third-party tools; for example, using a C# wrapper around an older technology such as COBOL. Apply the benefits of the wrapper on top of the core technology in new, more modern ways, such as facilitating the development of mobile tools.
Rehost If a legacy application is providing good value but is expensive to run, it might be a candidate for rehosting. Rehosting involves keeping the same basic functionality, but moving it to
the cloud where it is easier to manage and less expensive to run. This is also called “lift and shift.”
In a rehosting situation, the legacy application might be currently located either on a local VM or
on local hardware. Some VMs might be eligible to move with a simple migration. Those on local hardware might be convertible through a physical-to-virtual (P2V) migration, after which the resulting VM is hosted in the cloud. Some VMs, especially older ones, might not migrate easily to the cloud without
some significant work. In those cases, you might want to consider reenvisioning and building the
application in the cloud.
Reenvision If a legacy application is providing good value but cannot be easily migrated, the best solution might be to reenvision it and build it again on the cloud. Reenvisioning is a process of rebuilding the application in the cloud using modern technology, a new architecture, and best practices; it normally also involves adding more business value to core functionality, such as
improving market differentiation. Reenvisioning an application might require rewriting the main
logic by using a modern development language and tools and making it service oriented.
Reenvisioning an application can be facilitated by starting with VMs in the cloud, which can be
instantiated in a matter of minutes.
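The screening logic behind the five R’s can be sketched as a simple decision function. The thresholds, attribute names, and scoring scale below are illustrative assumptions, not prescriptions from the text; a real portfolio review would weigh many more factors.

```python
# Illustrative sketch: screening a legacy application against the "five R's".
# Thresholds and attribute names are hypothetical; real reviews also weigh
# compliance, dependencies, staffing, and so on.

def five_rs_recommendation(app):
    """Return one of: retire, replace, retain-and-wrap, rehost, reenvision."""
    value = app["business_value"]               # 0-10 score from stakeholders
    cost = app["annual_cost"]                   # run cost relative to budget, 0-10
    saas_alternative = app["saas_alternative"]  # off-the-shelf option exists?
    migrates_easily = app["migrates_easily"]    # VM or P2V moves cleanly?

    if value < cost:                # little value for the money: retire it
        return "retire"
    if saas_alternative:            # a cheaper modern replacement exists
        return "replace"
    if cost <= value / 2:           # good value, low TCO: keep and extend
        return "retain-and-wrap"
    if migrates_easily:             # good value, high cost, easy move
        return "rehost"
    return "reenvision"             # valuable but stuck: rebuild in the cloud

app = {"business_value": 8, "annual_cost": 6,
       "saas_alternative": False, "migrates_easily": True}
print(five_rs_recommendation(app))  # -> rehost
```

The ordering of the checks mirrors the text: eliminate the cheapest options (retire, replace) before committing to the more expensive ones (rehost, reenvision).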
Source of Information: Microsoft Enterprise Cloud Strategy
Saturday, February 18, 2017 | 0 Comments
Before we go on, it’s worth noting that the cloud provides an opportunity to consider the IT ecosystem as a whole and how it can be modernized. As you will see, cloud migration at scale involves looking at each application and determining its place in this new environment. Is further investment in certain applications justified? Should they be retired?
Many enterprises have held on to their applications for far too long without assigning them a maintenance or retirement schedule. Whether out of fear of complexity, or for lack of documentation, resources, or source code, applications remain untouched.
Even for applications that remain on-premises, modernization can save time and money. An internal Microsoft IT study in 2010 demonstrated that the number of problem reports (“tickets”) and the time to resolve them increased with the age of the application and system software. (This analysis led to a focused effort to ensure that all applications were on the latest version of the operating system and other systems software such as database.)
Moreover, and more important, migration to the cloud provides an opportunity to evaluate and modernize applications and, in particular, their business logic. This activity can deliver a strong return on investment and an impact on top-line revenue.
There are many motions that one can take to modernize application and services portfolios, such as the following:
Rehost Move a VM or an operating environment from the on-premises datacenter to a hoster
or a cloud. This model is also known as co-location.
Replatform When a legacy environment becomes unsustainable on cost or operational grounds, one solution is to “retain and wrap” the application without making changes to the code, though doing so can compromise the integrity and security of the operation.
Retire and Rewrite (or Reenvision) When there are sufficient new requirements that cannot be met by the older environment, the best way to proceed is to rewrite the application in a newer, better-suited environment. Often this occurs when examining the portfolio of applications and
consolidating several that have similar function.
Burst out With all of the new compute, data, and service models being provided in cloud environments, each offering capabilities and capacities that were never before accessible to an IT environment, many applications are bursting out to the cloud. These applications are doing innovative types of analytics, reporting, high-performance computing, visualization, and so on. Keeping frequently used (“hot”) data locally while aging out infrequently accessed (“cold”) data to far cheaper cloud storage is another common pattern.
Expand Enterprises are now exploring how to expand their older applications, adding functionality that gives mobile devices and web front ends the same capabilities that were previously limited to a computer screen. They are even moving to enhance the applications with search or video services, for example.
Cloud-Native Applications As companies begin their investigation of the cloud, they frequently realize that there are new forms of applications like Big Data, new types of analytics, entirely new capabilities such as machine learning, and applications for the Internet of Things (IoT) that are uniquely fitted to live in the cloud.
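The hot/cold data pattern mentioned under “Burst out” can be sketched as a simple aging policy. The 30-day cutoff and the tier names below are illustrative assumptions, not values from the text.

```python
# Illustrative sketch of the hot/cold aging pattern: keep recently used
# ("hot") data local and age infrequently accessed ("cold") data out to
# cheaper cloud storage. The cutoff and tier names are hypothetical.
from datetime import datetime, timedelta

COLD_AFTER = timedelta(days=30)   # assumed policy: untouched for 30 days

def assign_tier(last_accessed, now=None):
    """Pick a storage tier based on when the data was last touched."""
    now = now or datetime.utcnow()
    return "cloud-archive" if now - last_accessed > COLD_AFTER else "local-ssd"

now = datetime(2017, 2, 17)
files = {
    "q4_report.xlsx": datetime(2017, 2, 15),   # touched two days ago: hot
    "logs_2015.tar":  datetime(2015, 12, 31),  # untouched for a year: cold
}
for name, last in files.items():
    print(name, "->", assign_tier(last, now))
```

In practice, the same idea is usually implemented as a lifecycle rule in the storage system rather than in application code, but the policy logic is the same.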
Source of Information: Microsoft Enterprise Cloud Strategy
Friday, February 17, 2017 | 0 Comments
What if you were able to achieve both efficiency and innovation in all the business domains and applications across your entire portfolio? What if you could take advantage of the cloud and all of its resources and features to get a “the whole is greater than the sum of its parts” effect? With a good roadmap to lead the way, you can. This chapter covers what it means to move your enterprise to the cloud. We’ll provide examples and learning experiences from Microsoft’s own journey, as well as from those of our customers.
In any transformative change, it’s important to understand what the destination is and what the waypoints along the journey will be. There are multiple potential destinations for any application, and
IT cloud deployments will be a mixture of them:
Private cloud In a private cloud, cloud technologies are hosted in an on-premises datacenter. Private clouds can be useful because they can implement a technology stack that is consistent with the public cloud. This might be necessary in scenarios for which certain applications or data cannot be moved off premises. However, private clouds do not provide the cost savings and efficiencies that the public cloud can, because private clouds require a significant capital expense budget and a (potentially large) operations staff.
Infrastructure as a Service (IaaS) In IaaS, the application virtual machines (VMs) are simply moved from on-premises to the cloud. This is the easiest migration strategy and has many benefits, including cost savings. But, it still means that your operations staff will need to perform such tasks as patch management, updates, and upgrades. Nevertheless, IaaS is one of the most common cloud deployment patterns to date because it reduces the time between purchasing and deployment to almost nothing. Additionally, because it is the most similar to how IT operates today, it provides an easy onboarding ramp for the IT culture and processes of today.
Platform as a Service (PaaS) In PaaS, the cloud provider maintains all system software, removing the burden of upgrades and patches from the IT department. PaaS is similar to the traditional three-tier model of enterprise software, having a presentation layer (called “Web Role”), a business logic layer (called “Worker Role”), and persistent storage (Microsoft Azure SQL Database or other database). In a PaaS deployment model, all that the enterprise needs to focus on is deploying its code on the PaaS machines; the cloud provider ensures that operating systems, database software, integration software, and other features are maintained, kept up to date, and achieve a high service level agreement (SLA).
Software as a Service (SaaS) In SaaS, you simply rent an application from a vendor, such as Microsoft Office 365 for email and productivity. This is by far the most cost-effective of all the options because typically the only work involved for the IT department is provisioning users and data and, perhaps, integrating the application with single sign-on (SSO). Typically, SaaS applications are used for functions that are not considered business-differentiating, for which custom or customized applications encode the competitively differentiating business models and rules.
The hybrid cloud Many enterprises might choose to keep some applications on-premises— perhaps they are based on nonstandard systems or out-of-date software, or perhaps they will remain on-premises while waiting for their turn to be migrated to the cloud. In this model, some applications run in the cloud, whereas others remain on-premises, requiring a secure, high-speed communications path between the two environments. In a way, then, the cloud becomes an extension of the existing datacenter, and vice versa.
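One way to summarize the difference between these deployment models is in terms of which layers the customer still manages. The split below is a common rule of thumb, not an exact vendor responsibility matrix.

```python
# Rough rule-of-thumb split of responsibilities by deployment model.
# The layer lists are a common simplification, not a vendor-specific matrix.
CUSTOMER_MANAGES = {
    "on-premises": ["facility", "hardware", "virtualization", "os",
                    "runtime", "application", "data"],
    "iaas":        ["os", "runtime", "application", "data"],
    "paas":        ["application", "data"],
    "saas":        [],  # provider runs the whole stack; customer provisions users
}

def still_your_job(model, layer):
    """True if the customer's IT staff still manages this layer."""
    return layer in CUSTOMER_MANAGES[model]

print(still_your_job("iaas", "os"))   # patching is still IT's task under IaaS
print(still_your_job("paas", "os"))   # but not under PaaS
```

This matches the text: IaaS still leaves patch management, updates, and upgrades with your operations staff, while PaaS shifts all system software to the provider and SaaS leaves IT with little more than user provisioning.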
Source of Information: Microsoft Enterprise Cloud Strategy
Thursday, February 16, 2017 | 0 Comments
Listing the available tools and then making an informed choice about which are best suited to the task at hand helps lay the groundwork for a successful, well-planned, and well-executed foray into the online arena.
With the fundamentals covered earlier (having passion, learning to communicate, learning to relate, and building a community), an individual should be fairly well equipped for the journey into the online arena.
Making sure to include all of these in a blogging exercise will help improve traffic flow, which in turn creates the popularity that makes a blog a success.
There are also other elements that can contribute in a positive way towards a well planned entrepreneurial endeavor. These may include the use of networking in other ways which can be equally beneficial to a blog.
Taking the time and effort to always stay current and relevant is definitely the way to ensure a better chance of success. It also shows the individual’s level of commitment to the endeavor at hand and as such creates a level of expectation and confidence in the followers of the blog.
Wednesday, February 15, 2017 | 0 Comments
Taking the time to build a workable and comfortable community will go a long way in terms of resources, ideas, expertise and many other positive contributing factors that play a vital role in any situation.
Community existence is very important on many levels and thus should be nurtured and encouraged as much as possible. One of the best ways of building a community is to invite people to join in and actively contribute in any way that they may find beneficial to the overall entity.
Exchanging ideas could be a good starting point, and the internet is an ideal platform for this exercise. Using the many tools available on the internet one can actually make contributions on many different levels that are both beneficial and enlightening to others.
This is of course a welcome respite from having to deal with things individually; with a community working together, more things can be accomplished.
Using the various internet tools to encourage participation online is another avenue worth tapping, as its reach can be limitless. When there is a sense of community commitment, the strong sentiment helps to boost even the weakest endeavor, which in turn has the potential to produce amazing results.
Building a community also helps when it comes to promoting anything because the network available is vast and effective. The exchange of information often happens at a very fast pace and can effectively spread to the intended audience without any hindrance.
Tuesday, February 14, 2017 | 0 Comments
Empathy can go a long way in making others feel understood and accepted. When condemnation is kept to a minimum and exercised only when absolutely necessary, those involved will be able to function more effectively, confident of being accepted and respected for their contributions and efforts.
Though some quarters advocate keeping some level of formality and distance between the different levels of any working relationship, taking this too seriously will eventually dissolve the ability to relate to different people and different situations.
Such an environment can cause a project to stall or even struggle to reach its success because those involved are not able to relate to each other’s needs and thought processes.
Taking the time to understand and perhaps even lend a helping hand or giving well placed advice shows the individual’s intention of wanting to relate better and create a more conducive relationship or environment.
Another way to create the perception of, or willingness toward, relating to something lies in the effort expended in actually trying to learn more or get more involved in the endeavor or situation.
This gives all concerned an opportunity to understand the various aspects involved which in turn could contribute to a better level of tolerance and success.
Avoid forming opinions and making remarks until one is completely sure of all that has transpired. By practicing this simple patience while gaining full knowledge of a situation, the individual is able to relate better to its different aspects and thus render a better overall judgment, which will then be accepted as relatable.
Monday, February 13, 2017 | 0 Comments
Understanding that people generally respond in a variety of ways to the various styles of communication is perhaps a very important advantage to acknowledge.
Using one style of communication for everyone and everything is both unwise and can contribute to a lot of wasted time and effort. Also choosing the best communication methods will allow the intended message to be clearly conveyed and properly understood so that whatever is required can be done with minimal or no mistakes at all.
Some tasks may require written communications, while others would be better addressed with a more interactive style such as verbal communication, and yet others may require some form of visual, or a combination of visual and audio, communication. Thus, finding out which style best suits the situation is very important indeed.
When it comes to more urgent and formalized communications there may be a need to have a combination of styles to ensure all parties are clear on what is being communicated and their individual roles to ensure success.
The lines of communication should always be accessible and nonthreatening, as this will help ensure all participants are comfortable enough to ask questions where and when in doubt. There are also some areas where simple verbal or written communication is not enough, and this is where complementary charts and other visual aids may be needed to help paint a clearer picture of what is required.
The important point to always remember is to ensure all that is being communicated is well received and understood before moving onto the next step in any situation.
Sunday, February 12, 2017 | 0 Comments
There are several ways to know if one is really passionate about a particular endeavor. Below are just some indications that are worth exploring to ensure one has what it takes to succeed:
Once a project has been identified and the interest is apparent in seeing the project to its end, the individual should start the process of learning as much as possible about the varied aspects of the said project. This will give the interested party an idea of what the project requires on every level until the end is reached. It will also give the individual a chance to evaluate his or her interests against the possible problems that the project presents. If the confidence and passion are still not shaken or diminished even a little, then there is a good possibility that the project chosen is a suitable match for the individual.
Being around other people who share the same interest and still feeling the desire to explore it further is another good indication of the level of passion available for commitment. “Living and breathing” the interest, as some describe the sentiment required to be truly passionate, will be very apparent when there is a strong desire to be immersed in it to the fullest.
Being able to spend a large portion of one’s time devoted to the endeavor is also a mark of passion. When the time spent is considered pleasurable, the level of passion is evident. However, being overly consumed by the interest can also signal an unhealthy passion, especially when the commitment is disruptive to other aspects of the individual’s life.
Saturday, February 11, 2017 | 0 Comments
Among other things, Steve Jobs was known for his highly individualistic style and anti-political stand. Though mostly a libertarian at heart, he would take any opportunity to discredit economic doctrines that did not conform to his unique thought processes. This, rightly or wrongly, made him appear constantly opposed to a society where conformist behavior patterns are the order of the day.
Adopting a style in which interaction and the sharing of ideas were the basis of the work teams, Steve Jobs often encouraged a varied group of individuals from all levels within the organization to talk, discuss, and argue about the ideas being tabled. In this way he could gauge the sentiment toward, and acceptability of, anything being designed or launched.
Exploring this particular style is definitely an advantage that should be adopted and practiced as often as possible within any organization as the participation of varied mindsets is both informative and beneficial.
Believing that the person at the helm of every endeavor should be committed and competent to the highest level, he also saw the wisdom in ensuring that individual would be loyal to the Apple brand rather than to other motives such as self-interest or pleasing the boss.
Friday, February 10, 2017 | 0 Comments
Social media has officially gone mobile. But before you can begin crafting a social media marketing strategy, you need to first refresh your understanding of what makes something qualify as “social media”. After all, social
media and social networking are much more than just Facebook and Twitter.
According to Merriam-Webster.com, social media is defined as: “forms of electronic communication (as Web sites for social networking and micro-blogging) through which users create online communities to share information, ideas, personal messages and other content (as videos)”. For campaigns to be successful, social media has to be at the HEART of social media marketing. It isn’t enough to merely send out an offer; the goal in social media is to create a user experience. To build a buzz. To get people interacting.
Undoubtedly, social media is HOT! As one person put it, Facebook has been able to accomplish in less than 10 years more than the CIA has been able to do in their entire lifetime. There are billions of conversations going on right now. The trick is to figure out how to get them talking about you and your enterprise. That’s the good news…
The bad news is that social media is a complex matrix, and measuring conversion statistics is tough. Monetizing social media in any way is quite a challenge and certainly isn’t something that happens “by accident”. Whereas the typical eCommerce site, complete with a hard-hitting, high-converting landing page is an example of direct response marketing , social media is a lot more like public relations. It often proves to be a much more effective communication tool than a direct conversion tool. While there are examples of mobile-inspired social media campaigns that translate into sales (think Groupon announcements, and lowest gas price apps), if you are content to kick-start a lead generation initiative or are merely looking for a way to build buzz or increase traffic, success doesn’t have to be measured in dollars.
Thursday, February 09, 2017 | 0 Comments
We all get those “pie in the sky” dreams about what we want to do or what we will do with our websites. However, far too often, those dreams get so grandiose that it’s easier to abandon them than it is to pull them off. We’ve all been there, at one point or another. What if you don’t get around to building that customized website and just stayed exactly where you are right now? What would be the opportunity cost of that mistake?
Fortunately, you can guard yourself against worst-case scenarios by taking the time to implement the following tasks that make the most of the website you ALREADY HAVE!
1. Follow the basic rules of traditional SEO. Don’t spam keywords and phrases. Pay attention to relevance. Put some thought into it, but never sacrifice the human audience experience in order to attract search engine attention. Always aim for organic principles and best practices.
2. Use XHTML coding. Its strict, well-formed syntax surfaces bugs and errors early, and it’s one of the universal languages of mobile devices, generally ensuring accessibility across multiple platforms.
3. Shy away from Flash and other custom apps that may not function across various devices. Remember the old principle of everything - KISS (keep it simple, stupid!)
4. Don’t just optimize your website for the mobile experience - do the same for any videos or audio that you offer on your site.
5. Learn how to use what are known as “jump links” to quickly navigate to important parts of the website that may not be viewable on the mobile screen.
6. When specifying the size of graphic elements, don’t use pixels. Instead, rely upon percentages, which will translate across the board.
7. Don’t forget simple coding changes, like making any phone numbers click-n-call coded. This can save a lot of time. If you’ve ever been on your phone and hit up a search engine for information on a company and then had to hunt down a piece of paper and a pen in order to write it down instead of simply clicking to call, you know exactly how frustrating the oversight can be in the eyes of the end user.
8. As mentioned previously, submit your mobile sites to mobile search engines.
9. Take the time out to create a simple mobile site map that will allow easier navigation for your traffic and easier spider indexing for the search engines.
Wednesday, February 08, 2017 | 0 Comments
When it comes to Internet marketing, it’s all about SEO. When it comes to mobile marketing, it’s all about MSEO - Mobile Search Engine Optimization - a critical step that must precede any successful campaign. In the early days, mobile device marketing came with many constraints, and developers were generally limited to WAP (Wireless Application Protocol) or the .mobi domain to get their messages across. When greater choice and competition entered the mobile arena, all of that changed. Today’s mobile experience can easily measure up to the traditional website experience; however, it can’t be accomplished without a bit of research and hard work on your part.
Wondering why it matters? Let’s think about it like this: According to iCrossing, there are about 1 billion computer users, compared to a whopping 2.3 billion mobile subscribers - and that was about three years ago! As mobile devices grow increasingly sophisticated AND affordable, growth and increased traffic forecasts will remain promising.
When it comes to creating a specific, separate site for mobile devices, this is - admittedly - a bit more difficult, but still do-able. Doing it well is something you can do yourself, assuming you arm yourself with the right tools and tricks. Smashing Magazine offers a fairly in-depth tutorial that will arm you with what you need to know if you just visit:
Local directory submission is simple and free (if you do it yourself), but you can often benefit from service providers who specialize in multiple (and often targeted) directory submissions at incredibly affordable prices. For the equivalent of about $33.00 a month, you can enlist the aid of www.majon.com to gain access to a listing in over 100 different local directories PLUS a custom created video that you can use to promote your offer. However, if you’re intent on doing it yourself, there is a great list to start out with that you can find for free at:
Tuesday, February 07, 2017 | 0 Comments
It doesn’t take a rocket scientist to figure out that potential website traffic who want to experience your site on their mobile devices have remarkably different wants and needs when it comes to the browsing experience. First of all, most mobile devices - up to and including the tablet PC - have much smaller screen resolutions than the laptop and desktop playing field. The least sophisticated of Web-capable phones (those with some Internet access, but below 3G standards) only offer a few square inches for the viewing experience and are slow to load many graphics. Even a large tablet computer will have a noticeably smaller screen than even a compact laptop (though comparable to the netbooks that became popular prior to the tablet’s emergence).
Then, there are operating systems to take under consideration. Obviously, Apple products will run markedly different programs than those on a Windows or Android based mobile device. All of this will mean that you’ll need to know how to manipulate your traffic’s experience based on the mobile devices you know they are using.
Obviously, enterprises operating on shoe-string budgets don’t have the luxury of tackling each and every opportunity at once. Therefore, there needs to be some system of differentiation and prioritization. Finding out which devices and systems to target can be a bit intimidating, but there are some fairly easy ways that you can learn what you need to know. One of the fastest and most affordable is to conduct a simple survey.
The Mobile Marketing Preliminary Research Campaign
Surveys are incredibly sensible, simple and highly economical - especially if you are already using an email marketing client that allows for the creation of simple surveys for your list subscribers. What you do then is create a simple survey - preferably with no more than 5-10 options for your subscribers to choose from - to indicate which mobile devices they regularly use. Consider making this a check-the-box survey where multiple answers are allowed and encouraged.
You can greatly increase the conversion rates in regards to fishing out the specific information you need by offering some sort of incentive for survey completion. This might mean offering access to an exclusive free report, a discount code for use with the next purchase or some other low- to no-cost option that will sweeten the deal and encourage the people who are most likely to respond to your mobile marketing initiatives to tell you the best way to get it done.
Put a solid deadline on your survey incentive so that you can quickly and easily tabulate the answers for an overall understanding of which mobile devices will offer you the greatest financial rewards. You can spruce up your survey with additional questions, asking subscribers which ways they prefer to receive communications. You might find that even though a lot of people have iPhones, if they do, they are only interested in specific apps, which you’re not ready to produce at the moment. If this is the case, it sure helps to know that they prefer to receive text messages and discount codes, etc. - a campaign that is often well within reach of even the most cash-strapped establishments.
Once the deadline has passed, it’s time to tally up the information. While your survey client may offer free reports that break down the information you need and ignore what you don’t, this may not be the case. If you need to do it the old-fashioned way, here are some of the more important things you want to find out:
1. How many total subscribers were invited to participate in your survey?
2. Of those, how many subscribers DID participate?
3. Which mobile device had the greatest popularity? Work your way down the list until you have created a glimpse of your market’s mobile capabilities.
4. If you asked other questions, tabulate the answers here. For example, if you asked which method was preferred for marketing messages, what answer came out on top? Which was the biggest loser?
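The tally steps above can be sketched in a few lines of code. The response data and device names below are invented for illustration; in practice these would come from your survey client’s export.

```python
# Illustrative tally of the survey described above. The response data and
# device names are made up for the example; a real tally would read an
# export from the survey or email marketing client.
from collections import Counter

invited = 500
responses = [                     # each entry: devices one subscriber checked
    ["iPhone"], ["Android", "tablet"], ["iPhone", "tablet"],
    ["Android"], ["iPhone"],
]

participated = len(responses)                       # step 2: who responded
participation_rate = participated / invited         # against step 1's invites
device_counts = Counter(d for answer in responses for d in answer)  # step 3

print(f"Invited: {invited}, participated: {participated} "
      f"({participation_rate:.1%})")
for device, count in device_counts.most_common():   # most popular first
    print(device, count)
```

Because the survey allows multiple answers per subscriber, the device counts can sum to more than the number of participants; that is expected for a check-the-box survey.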
Now that you are armed with a clear overview of your target market and their mobile marketing preferences and options, prioritize your list of options and ready yourself to visit some of the following websites for more information on coding your site to various specifications for easy access across the board.
Monday, February 06, 2017 | 0 Comments
How easily we forget a world that once operated without the benefit of cellular phones and mobile technology... Undoubtedly, it’s a trend that is here to stay - at least until something more novel, more efficient, or entirely new is created and deployed across the planet. From the first car phones on to the most basic of cell phones, change started gradually but has steadily gained momentum. After all, an object in motion tends to stay in motion.
While mobile devices are widespread, it’s important to understand that all mobile devices are NOT created equally. Different types of devices require different coding, different apps, and different campaign execution and results. Over the next few pages, we’ll cover some of the
most popular and prevalent mobile devices currently on the market. We’ll highlight the pros and the cons of each, framing the review in such a way that you can plug in your unique situation and work through the process of discovering which mobile marketing campaign will have the greatest pay-off in your scenario. More importantly, a moment will be devoted to discussing the demographics that typically go along with each of them. After all, you want your mobile marketing campaign to actually reach the people your organization is attempting to advertise and entice.
First there was the iPhone. Then HTC brought Droid onto the scene. Now, most cell phone plans offer options (and encourage upgrades) to SmartPhone technology, an increasingly digital and web-based mobile device. SmartPhones like the iPhone and the professional office’s BlackBerry offer a number of potential perks, which mean that organizations and causes have increasingly more (and more interactive) ways to draw in new customers with highly targeted and specialized campaigns that can really boost your bottom line. Examples of these perks include:
• 3G and 4G connectivity offers the ability to quickly download content, message, send and receive email, and surf the mobile web.
• GSM (the Global System for Mobile Communications) means that a phone can communicate pretty much worldwide, and is ideal for those who spend a great deal of time jet-setting between countries and continents.
• Bluetooth means hands-free talking and operation while driving and otherwise on the go. Of course, just because no hands or wires are required DOESN’T mean that you are better able to handle the demands of conversing while driving.
• Wi-Fi access means that SmartPhones can piggyback on free wireless signals in designated areas, translating into Internet on the go without Internet sticks or satellite connections.
• USB connectivity lets a phone hook up and communicate with other computers and devices in your home or office network.
• Infrared capability translates into the ability to transfer information and data across short distances to other infrared devices.
The only real drawbacks when it comes to mobile marketing to SmartPhones stem from the fact that various devices use different resolutions, browsers, coding, and other factors, which means that you as a marketer have to take steps to ensure a seamless visit on all of the major phones and operating systems out there. While this can be a headache, it isn’t “hard” to do, just time-intensive. Fortunately, once it’s done, it’s done, and the rewards will soon make themselves quite evident.
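To make the “take steps” part concrete, here is a minimal sketch of what device-proofing a page can look like: a viewport declaration that tells mobile browsers to scale to the device, plus a media query that swaps to a simpler layout on small screens. The class names and the 480-pixel breakpoint are illustrative assumptions, not a standard every phone follows.

```html
<head>
  <!-- Scale the page to the device's width instead of a zoomed-out desktop view -->
  <meta name="viewport" content="width=device-width, initial-scale=1">
  <style>
    /* Default (desktop) layout: sidebar beside the main content */
    .sidebar { float: left; width: 30%; }
    .content { float: left; width: 70%; }

    /* On narrow screens (most phones), stack the columns instead */
    @media (max-width: 480px) {
      .sidebar, .content { float: none; width: 100%; }
    }
  </style>
</head>
```

A handful of rules like these, tested across the major phone browsers, is usually the bulk of the time-intensive work described above.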
Once the iPod and iPhone took the world by storm, it was only natural that the iPad would emerge. And when it did, it caught on like wildfire. Today, Apple no longer dominates the field of tablet PCs, meaning that more individuals are able to afford and buy them. Tablet PCs offer most of the benefits of a laptop computer without as many of the drawbacks. Unlike cell phones, however, most do not receive calls or traditional text messages. Instead, they browse the Internet, create documents, take pictures, send information and data via email, put on presentations and much, much more. Businesses use them. Creative professionals rely on them. They simply make life easier.
Coding your HTML for tablet PC compatibility in your website campaign is relatively easy, and it offers more sophisticated options above and beyond mere apps. Presentations are a breeze, complete with bells and whistles, along with easy access to web connectivity when needed, especially for those in range of cities that promote expansive wireless networks.
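As a hedged example of what that tablet-friendly coding can involve, the snippet below targets the mid-size screens typical of tablets with a media query, so the same page adapts without a separate app. The pixel range and class name are common conventions chosen for illustration, not fixed requirements.

```html
<style>
  /* Roughly tablet-sized screens: bigger touch targets, two-column listing */
  @media (min-width: 481px) and (max-width: 1024px) {
    a, button { padding: 12px; }      /* easier to tap with a finger */
    .products { column-count: 2; }    /* two-column product layout */
  }
  /* Larger screens fall through to the full desktop stylesheet */
</style>
```

Because this lives in ordinary HTML and CSS, one well-tested stylesheet can serve phones, tablets, and desktops alike.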
When it comes to building a mobile marketing campaign that specifically caters to or targets the same demographics that are heavy users of tablets, one of the best approaches is to think like the consumer and end user of your product. The more seamlessly you can meld your software package, training manual, book, membership site, retail site, etc. with the tablet experience, the greater the probability that a query will eventually translate into a sale.
Sunday, February 05, 2017 | 0 Comments
The mobile marketing outlook is a positive one, to say the very least. In addition to mobile access overtaking traditional PC Internet access in the next few years, there are other predictions that should set your profit senses tingling. With the advent of low-cost, no-contract cellular plans, virtually everyone has AT LEAST access to text messaging capabilities. Although these messages are easy to delete and often fail to captivate interest or prompt action, the sheer reach of text messaging means that even minimal conversion rates can spell incredible revenue potential when attention is paid to targeting and tracking the analytics.
Think about it like this: the iPhone caught on an incredible 10 TIMES faster than America Online did, oh so long ago when it took the world by storm. Add to that the fact that Unisys reported that it can take the average person 26 hours to report a lost wallet. If the same average person loses their cell phone, a mere 68 minutes will pass before it is reported as lost or stolen. Want another way of looking at it? According to the Mobile Marketing Association Asia, of the roughly 7 billion people on the planet, 5.1 billion of them have a cell phone. Comparatively, only 4.2 billion of them own a toothbrush!
Now, here’s some even better news... Thanks to the folks at Mobile Marketer, we now know that 70% of mobile searches result in action within one hour, and mobile coupons enjoy an impressive 10 times the redemption rate of traditional coupons (thank you, Borrell Associates).
The trends for 2012 indicate a number of things that can heavily impact your enterprise. SmartPhones are expected to overtake other mobile options this coming year, meaning that while text will still be a popular, profitable medium, apps will grow in profit potential and demand. Social networking will continue to be in demand, as mobile consumers seek to accommodate their addictions to status updates and tweets. Social gaming is another aspect of mobile marketing set to take flight, as proven by the global phenomenon of Angry Birds(TM). Furthermore, location-based targeted marketing campaigns will be easier to envision and implement, translating into more foot traffic for brick-and-mortar establishments that may have previously been hesitant to dip their toes into the world of mobile marketing.
Saturday, February 04, 2017 | 0 Comments
Mobile marketing isn’t exactly a “new” thing, but it does happen to be “the” thing. While social media may have been the buzzword of 2010 and 2011, even Facebook has gone mobile. When industry giants recognize and cater to a “trend”, it is no longer a trend but an ever-present reality. Mobile marketing takes on a number of different forms, ranging from the “simple” SMS text message all the way to full-fledged mobile apps and mobile-ready video and commercials. Entire books could fill up the spaces in between.
In a nutshell, mobile marketing is taking your message to today’s cellphones, SmartPhones, tablet PCs and other mobile devices that are tucked into purses, pockets and other easy-to-reach locations across the entire globe. It could mean offering a mobile version of your regular website, offering a banner advertisement on another website, ensuring that your email marketing is mobile-ready, developing an app, partnering with an established mobile advertising agency, getting your mobile site listed on the mobile search engines and much, much more.
Some enterprises will be content with a simple, one-dimensional approach to mobile marketing, and may only use one or two of the options. Others will create a richer, more comprehensive strategy that brings all of these components into play. Often, which outlets you decide to use will depend greatly upon individual and unique factors that only you fully understand. We aren’t here to recommend one type over another, only to encourage you to make the strategic move towards integrating mobile marketing into your overall marketing mix.
Friday, February 03, 2017 | 0 Comments
We live in a world that is always on the go. And thanks to the advent of new technologies, we always take our ability to communicate with us. What began in the ’90s with the beeper fad is still alive and well today, with SmartPhones and tablet PCs that put the World Wide Web at our fingertips no matter where we happen to be. It’s a new and relatively uncharted territory with virtually limitless profit potential. The problem is that the world of mobile marketing and advertising is one with blurry boundaries and undefined borders.
One of the biggest problems in the world of mobile marketing integration is that too many enterprises view mobile marketing as merely an extension of Internet marketing instead of its own new universe. While there are a number of similarities that make this mistake such a prevalent one, the reality is that mobile marketing is such a widely segmented niche that it is worthy of consideration as a separate entity with an ever-expanding reach.
Think about this incredible tidbit - it is estimated that by the year 2014, mobile phone Internet usage will surpass that of traditional desktops and laptops. That’s a mere two years away.
If more and more of your target market and customer base will be transitioning to mobile technology for their World Wide Web access, isn’t it crucial for your business to anticipate this shift and build a mobile presence that can handle heavy traffic and specialized demands?
Thursday, February 02, 2017 | 0 Comments