DIGITAL LEADERS 2015
PLAN FOR THE FUTURE
ENHANCE YOUR IT STRATEGY
bcs.org/digitalleaders
CONTENTS

Sponsor's article (LANDESK) 10

BCS
Digital maturity or delusion? Jos Creese, BCS President 12
Wants and needs Brian Runciman 14

Business Analysis
Analysis-as-a-service James P. Howard & Scott Beaumont 20
Complexity challenge Martin Sharp 22
Sharks and ice cream David Geere 24
Enabling success Debra Paul 26
User-centered IT for the age of everywhere
A modern IT department knows that its primary objective is to enable the productivity of the company's end users, and to do so in a way that is controlled and secure. LANDESK has your solution. Using our integrated end-to-end service management solution, only LANDESK unifies and automates the delivery of all the assets and services users need.
Want to find out more?
Call us on: +44 (0) 1344 442100
Email: [email protected]
www.landesk.com
Communications
What is collaboration? Phillipa Hale 28
Mobile: More than apps Richard Phillips 30
Communication pitfalls Justin York 32

Cloud Computing & Data Management
Quality data Lee Warren 34
Future differentiator Bob Mellor 36
Information management Dr C. James Bacon 38
Modelling matters Stephen Ball 40
2015 DIGITAL LEADERS
Entrepreneurs and Start-ups
Building your business Richard Tubb 42
Incubate/Accelerate Stephanie Allen 44
Smart specialisation Christopher Middup & Alex Borg 46

Green IT
IT recovery Danny Carri 48
Life sustaining Bob Crooks 50
Data tsunami Neil Morgan 52

Internet of things
Intelligent buildings Ed Macey-MacLeod 54
Global vision Gordon Fletcher 56
Hub of all things Peter Ward 58

Learning and Development
Leadership DNA Beverley Corcoran 60
Learning organisations Sara Galloway 62
Meaningful mentoring Jill Dann 64
Processes and people Dr Janet Cole & Dr Chris Littlewood 66
E-learning business analysis Julian Fletcher 68

THERE ARE MORE ATTACKERS THAN DEFENDERS
Manage the threat. Our information security portfolio delivers the knowledge and capabilities you need to keep your information safe.
bcs.org/infosecurity

© BCS, The Chartered Institute for IT, is the business name of The British Computer Society (Registered charity no. 292786) 2015
Project Management
Agile evolution Robert Cowham 70
Death of project management Bryan Barrow 74
Testing times Frank Molloy and Mark Carter 76
Two speed IT Steve Burrows 78
Lean projects Gary Lloyd 80
We are all project managers Stephen A. Roberts and Bruce Laurie 84
Lines of defence Peter Mayer 86
Politics kills the project star Dr Stefan Bertschi and Gandin Hornemann 90

Security
A new hope Guarav Banga 92
Anti-social media Reza Alavi 94
Changing spots Elke Bachler 96
Cyber risk register Shaun Cooper 98
Cyber war and you Szilard Stange 100
Fear, uncertainty and doubt Vineet Jain 102
Generation game Louise T. Dunne 104

Service Management
Back to the future Chris Evans 106

Are you ready to drive business change?
Our comprehensive range of business and IT books will help your IT team develop the skills to lead that change.
bcs.org/bookshop
Telephone +44 (0) 1793 417 440 or email [email protected]
Doing IT wrong Steve Sutton 108
Eliminating waste Dr David Miller 110
Five steps Michael Moulsdale 114
The future of service management Shirley Lacey 116

Software Development
Trustworthy software Alastair Revell 118
Cloud-based software systems Matthew Skelton 120
Faster, greener programming Jeremy Bennett 122

Strategy
Efficient supply chain Robert Limbrey 126
Communicating globally George Budd 128
CIO driving seat Christian McMahon 130
Strategic opportunity Adam Davison 132
Connecting the dots Oscar O'Connor 134
Women digital leaders Jacqui Hogan 136
Changing world Bruce Laurie and Stephen A. Roberts 138
Productivity drain Jeff Erwin 140

Computer Arts
Computer art today and tomorrow Catherine Mason 142

Coming soon: Careers in IT Service Management
This series of practical books provides informed career development guidance for IT professionals. With CPD advice, skills development, tips and case studies, they are ideal resources for all IT departments.
bcs.org/ITroles
Telephone +44 (0) 1793 417 440 or email [email protected]
SPONSOR
INTEGRATE TO INNOVATE
Ian Aitchison, Director Product Management at LANDESK, explains how integrating and automating
existing service management processes can free up time for valuable business innovation.
In an age of cloud-based solutions, app
stores and consumer-driven business
technology, organisations increasingly
expect more innovation from their IT
departments.
The IT function itself wants to transform
and offer new services, but limited
resources, skills and time make it hard
to innovate, particularly given the effort
involved in just keeping the lights on.
The IT department’s need to follow
corporate and industry best practices can
encourage the perception that it is slow
and bureaucratic—especially in the face
of ‘shadow IT’, where business managers
simply bypass corporate controls to deploy
new solutions.
To drive business success and
individual productivity, the IT department
should seek opportunities to adopt
new techniques for integrating existing
components into new services.
Increasingly, this will mean combining
technology and services, whether
on-premise or cloud-based, to create
innovative, rapidly deployed solutions
that raise IT’s profile as a contributor to
business value and agility.
As the IT department embraces its
forward-facing, transformative role, it
won’t abandon existing practices. Rather,
in what Gartner terms ‘bimodal IT’, it will
both maintain the rigour to keep core
systems running and develop the flexibility
to respond rapidly to new business
requirements. Happily, the necessary
ingredients to realise the vision are usually
already in place.
Integrate through automation
With the business increasingly looking to
over-burdened IT departments for
innovation, the easiest path may be to
assemble the existing components of the
application landscape and IT infrastructure
in new ways.
Plugging technologies together was
traditionally a tough job, requiring
engineers and developers to spend
significant time and money.
However, new approaches for defining
processes around human and technology
interaction make it far easier to integrate
services to create a dramatically better
user experience.
What’s more, by using automated
workflows to stitch business processes
together and create new value for
the business, the IT department can
simultaneously release staff from
repetitive tasks and focus on further
innovation.
Using automation technology to
integrate process flows can eliminate
human error and delays, ensuring that
user requests are handled faster and
more effectively. This also frees up
talented IT staff from low-value work,
giving them time to really use their skills.
A simple example of how this can
work is request fulfilment: delivering
software and services to individuals on
request. Rather than
requiring users to submit a request, wait
for approval, and then wait for an engineer
to perform the installation, IT departments
can follow the consumer model and create
a corporate app-store experience.
The required steps might be: take
a self-service service catalogue and a
request-fulfilment process from service
management, connect it to system-management
technologies, and give users
the ability to request and then receive
instantly deployed, pre-approved software
and services on their own devices. Even
where approval of purchasing is required,
this can be largely automated. And this
doesn't just improve the user experience,
it's also more cost-efficient, as the same
capability can then be used to recover
unused software licences, harvesting them
from devices and returning them to the
pool after a set period of time.

Build on existing approaches
One of the notable points about the app-store
example above is that it uses tried-and-trusted
service management approaches to
drive a better and more responsive
experience for the business user.
Service management concepts and
best-practice frameworks such as ITIL and
Lean will not go away, but will remain key
differentiators between shadow IT and the
true corporate IT capability.
Integrating services properly is what
will really add value, and this is precisely
where the uncontrolled, piecemeal
approaches of shadow IT fall down.
Integration is all about business processes,
so existing IT service management
principles and skills will continue to be
absolutely relevant.

By layering new automation capabilities
on a foundation of proven, established
frameworks, IT departments can maintain
control, consistency and adherence to best
practices, while delivering a much faster
and more integrated service to users. At
the same time, they can redirect staff
towards higher-value, more satisfying
and more creative work. Their years of
accumulated IT experience and expertise
will continue to add value, and they will
also have more time to bring new concepts
and ideas to the business.

In the typical over-worked IT
department, time is a luxury item in very
short supply. The need to keep the lights on,
in an increasingly complex infrastructure in
which unknown risks are continually
introduced through the back door by
shadow IT, can lead to a blinkered,
heads-down approach to work. By introducing
automation in targeted areas, the IT
department can remove the drudgery of
doing the same old activities over and over
again, and gain breathing space to think
about how best to serve the business.
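The request-fulfilment flow described above can be sketched as a simple automated workflow. This is a minimal illustration only: every name and step here is hypothetical, not a LANDESK API, and a real implementation would call the service-management and systems-management tools' own interfaces.

```python
# Illustrative sketch of an automated request-fulfilment workflow:
# pre-approved catalogue items deploy instantly, licences are drawn
# from a pool, and unused licences are harvested back into it.
# All names are hypothetical, for illustration only.
from dataclasses import dataclass


@dataclass
class CatalogueItem:
    name: str
    pre_approved: bool   # approved in advance, so no human sign-off needed
    licence_pool: int    # licences available for instant deployment


@dataclass
class Request:
    user: str
    item: CatalogueItem
    status: str = "submitted"


def fulfil(request: Request) -> Request:
    """Automate approve -> deploy, falling back to manual steps."""
    item = request.item
    if not item.pre_approved:
        request.status = "awaiting_manual_approval"  # the exception path
        return request
    if item.licence_pool <= 0:
        request.status = "awaiting_licence"          # triggers harvesting
        return request
    item.licence_pool -= 1                           # consume a licence
    request.status = "deployed"                      # no engineer visit
    return request


def harvest_unused(item: CatalogueItem, idle_devices: int) -> None:
    """Return licences from devices where the software sat unused."""
    item.licence_pool += idle_devices


# Usage: a pre-approved item deploys instantly; licences are recycled.
office = CatalogueItem("office-suite", pre_approved=True, licence_pool=1)
r1 = fulfil(Request("alice", office))   # deployed, pool now empty
r2 = fulfil(Request("bob", office))     # waits for a licence
harvest_unused(office, idle_devices=1)  # reclaim from an idle device
r3 = fulfil(Request("carol", office))   # deployed from the recycled licence
```

The point of the sketch is the shape of the flow, not the code: the only manual touchpoint left is the exception path, which is exactly where the article argues human effort should be concentrated.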
Getting started
As traditional IT departments gear up for
a more innovative future, a good exercise
is to step back and look at an individual
user’s experience across a number of
typical processes.
By identifying where the delays and
difficulties lie, IT staff can then look at easy
integration opportunities that will close the
gaps and make users happier and more
productive.
To support the transition to innovation,
what’s needed is technology that can
interact with, influence and impact the
individual. The ability to define processes
of human and technology interaction
should be a fundamental part of any
solution, rather than a bolt-on option
or afterthought, because agility will not
come if IT departments are over-reliant
on expensive, low-level software code
engineering.
This is not to say that technical skills
will vanish; they will still be required, but
there is an increasing need for technology
capabilities that can move at the speed of
the business. Indeed, this is precisely how
shadow IT has found a foothold in many
organisations.
Today, great technology is easy to deploy
and use, so valuable staff with business
and technical skills can spend less time
on tedious low-level tasks and more time
making a tangible difference to the business.
Achieving integration through
automation will depend on developing
comprehensive capabilities across
both service management and systems
management.
Finding a single vendor with proven,
joined-up solutions spanning both worlds
will pay dividends in time and effort versus
deploying siloed solutions. It will also
enable a more comprehensive approach to
innovation, driving greater potential value
for the business.
In the future, integration will not only be
between existing on-premise solutions but
will extend to the more diverse solutions
introduced by shadow IT. Therefore, IT
departments should seek a vendor that
also offers open integration with all leading
technology platforms.
BCS, THE CHARTERED INSTITUTE FOR IT
DIGITAL MATURITY
OR DELUSION?
Jos Creese MBCS discusses how the role of IT is changing and suggests that the profession, as a whole,
must keep up with these changes for the sake of business and society in general.
As a practising CIO for over 20 years I have
seen a great deal of change in the IT
profession. As technology has evolved, so
the role of the technology
specialist has also adapted, not only in
learning new tools, but in how they are
used in our organisations and in homes.
There are, these days, more IT
specialists involved in the application and
use of technology rather than in deep
technical design and development.
Yet the public perception of the IT
professional is a continuing stereotype
– that of a technical specialist, probably
writing software code and probably
male and white – basically, a young
‘geeky’ bloke in an anorak. Of course, we
do need more technology specialists,
especially in areas such as cyber security
and software coding, but we are also
desperately short of other IT skills,
such as business analysis, project and
programme management, technology
risk management and IT-enabled change
management. These are essential to
support the shift to true digital operation.
Moreover, the misperception of IT as a
male-dominated ‘engineering’ profession
limits the attractiveness of the profession,
especially for women who are hugely
under-represented.
If our profession is to achieve more,
then we have to promote the broader
image of IT. The development of the Skills
Framework for the Information Age (SFIA)
has been crucial in helping this shift, as
it clearly identifies the widest range of
IT functions from the most technical to
the more business-related. We need to
move faster, however, as the debates
hot up around the difference between
‘digital’ and ‘IT’. Our organisations need IT
professionals that can help to ‘re-plumb’
the business, and so to maximise the
potential of technology to improve
competitiveness, customer service,
efficiency, productivity and to drive
business innovation.
Nationally this priority could not be
higher, as our prosperity and social
well-being depends on transforming the
public sector to a lower cost base and
the private sector to be able to compete
in tough global markets. There are some
clear messages about what business,
government and the consumers of
technology expect from the creativity
and inventiveness of our IT professional
specialists.
So what does this mean in practice?
It means that IT professionals need to
understand the difference between 'being
digital' and 'using IT well'. It requires IT
professionals to be able to bridge the gap
between technology opportunity and the
creativity of our business leaders. And
it means that the IT profession (i.e. BCS)
must have a clear and strong voice in how
this information age revolution should
influence government policy.

So IT professionals, or at least some
of us, need the 'softer' skills that are
necessary to perform at the highest
level in all our organisations, influencing
board-level members and being central to
business change programmes. Without this
level of involvement in service planning, the
opportunity of technology will be at least
partially missed and more than likely will
be a 'bolt-on' to existing working practice.
This is bad news for any organisation,
because it will mean short-term
marginal improvements are made from
new technology, which, over time, will
lock organisations into outdated and
uncompetitive practices that are hard and
expensive to unpick. It will also cast future
IT into the role of an expensive inhibitor to
change.

Studies have shown that in every sector
genuine 'digital maturity' drives savings,
productivity, competitive edge, greater
market share and an ability to adapt in the
face of dynamic market conditions. BCS
specialist groups are therefore forging
intelligent links between IT specialisms
and the application of IT for business
leaders, and digital workstreams are
ensuring that IT is better understood,
especially in the step change from old to
new ways of working.
The digital maturity challenges are
hugely exciting for the IT profession,
resulting in a radical re-think from top to
bottom of existing business models and
how technology can be better used and
supported. Whilst IT needs to work closely
with other professions to do this, especially
in finance, human resources, marketing,
procurement, policy and service heads,
we are arguably the only profession that
has a deep enough understanding of how
organisations function and how IT works,
to reshape what organisations do and how
they do it.
There is another challenge for us:
in helping organisations to move from
outdated practices, structures, reward
systems and governance models to a
‘digital’ world, we must ensure that we do
not disenfranchise people or communities.
There are many who still cannot or do not
use technology – because of barriers of
language, geography, cost, disability or just
plain fear of the internet. The digital world
I envisage will embrace universal access, ensuring that 'digital by default' empowers
everyone and does not depersonalise,
disenfranchise or demand conformity.
It is a world where we support
vulnerable people and isolated
communities much better than we have
ever done before. It is a world where
technology improves access for all and
equality of opportunity, as well as allowing
smaller organisations to prosper in a
competitive world.
Carefully designed digital services
allow easier, better informed and faster
collaboration, which, coupled with
improved customer access, maximises
the take-up of digital services and so the
return from technology investment.
This is a brave new world. It is
one where innovation, creativity and
entrepreneurship are enabled and never
held back by technology. It is one where all
staff, customers, suppliers and partners
are truly empowered and have a voice. It
is one where public and private sectors
transform to be the champion of us all,
ensuring the UK remains a great place to
live, work and to do business.
And we are ahead of the game - the
UK has arguably some of the strongest
platforms for ecommerce in the world and
the innovation across public and private
sectors through adoption of technology is
world-class. All of us, as IT professionals,
need to work with BCS as the Chartered
Institute for IT to champion these changes.
WANTS AND NEEDS
Everyone wants a larger budget, but when only eight per cent of participants feel that their organisation has
enough resources and 79 per cent indicate that they need enhanced IT skills among their existing workforce
or additional IT staff, the digital leader has plenty on his plate for 2015. Brian Runciman MBCS reports.

For the fourth year BCS, The Chartered
Institute for IT has run a survey looking
at the needs of the digital leader. And
however fast IT systems develop, the
business views and key problems stay
much the same.

For example, 55 per cent of participants
rate business transformation and
organisational change as among their
organisation's top three management
issues for the next 12 months. This is
followed by strategy and planning (50%)
and operational efficiencies (48%).

Businesses are clearly seized of
the need for IT to effect change in
their business dealings and internal
organisation. As would be expected, SMEs
and corporates have slightly differing
needs: among SMEs the issue most likely
to be in the top three over the next 12
months is strategy and planning (59%).
For companies with over 250
employees, business transformation and
organisational change (61% versus 40%)
and operational efficiencies (53% versus
36%) are more likely to be high priorities
than for SMEs.

Some issues were mentioned in the
'free text' part of the survey that will
undoubtedly become more common
concerns in the medium term, such as
regulatory response (which perhaps only
comes onto the radar when legislation is
more immediately looming) and platform
rationalisation.

Specifics: top IT topics
As to specific issues that need to be
addressed, it's no surprise to see the
greatest number of respondents (60%)
rate information security as among their
organisation's top three IT topics for the
next 12 months. This was followed closely
by areas that have moved well out of the
'jargon' phase and into business-critical
applications: cloud computing (55%) and
mobile computing (53%).

These are the same three issues that
were identified in last year's survey.
However, the order has changed, with
information security and cloud computing
going up one place and mobile computing
dropping two places.

Analysis by number of employees
shows that information security is the
top answer for both SMEs and large
companies (over 250 employees).
Mobile computing is more likely to
be a high priority for larger companies
compared with SMEs (56% versus 45%),
whereas social media is more likely to
be a high priority for SMEs compared
with larger organisations (24% versus
9%), perhaps a reflection of the shift in
marketing approaches. The next concerns
for the large organisation were big data
(36%) and agile (22%).

Some of the issues much discussed
in the media are not yet really on the
business radar in a large way, perhaps
indicating their niche market status at
present. These were the internet of things,
with only 11 per cent representation, and
3D printing with a paltry one per cent.
Other topics mentioned were agile and
operational alignment; robust IT for SMEs;
hosted telephony; general IT reliability
issues; the government digital agenda;
human-centred computing; network
bandwidth growth; and new platforms.

One commenter made a valid point
on the range of new services becoming
mainstream, saying that whilst
'implementing digital culture and practice'
is vital in the 21st century, organisations
still need to be looking at why they
would use these technologies, not just
implementing technologies because of the
'look' of them.

Skills and resources conundrum
As noted in the introduction, only eight per
cent of participants feel that their
organisation has enough resources to
address the management issues and IT
trends that their company has prioritised.
More than half (53%) indicate that they
need enhanced IT skills among their
existing workforce, with the same
number requiring additional suitably
qualified IT staff.

Strangely, activities that would support
the above requirements were rather
fragmented, certainly in terms of relative
importance. Only eight per cent included
'identifying the capabilities of your IT
professionals' as a top three priority,
with 14 per cent considering the IT skills
shortage in their top three and 16 per cent
counting performance management in
their top three. Having said that, the survey
suggests that recruitment and retention
is a higher priority for more companies
compared with 2014 (up from 14% to 20%).
Larger organisations are more likely
than SMEs to need additional suitably
qualified IT staff (58% versus 43%).

Other concerns that cropped up,
although in much smaller numbers, were
effective corporate and IT governance;
business change expertise; and, to quote
a comment, 'better CEOs and other execs
who understand digital tech and its impact
on organisations' (on which more later).
Business skills and soft skills were also
mentioned.
Biggest IT skills gaps: techy
The IT skills gap section garnered the
biggest ‘free text’ response in the Institute’s
survey. IT leaders care about this subject!
The usual suspects were much in
evidence: IT security, taking advantage of
cloud opportunities, networking, and the
extraction of meaning from big data.
This need came from a smaller
business: ‘Up-to-date knowledge of
suitable SME low-cost apps and software
packages that can be customised and
integrated into our business.’
Concerns came from both ends of
the tech spectrum, from disruptive
technologies to legacy tools and services.
Keeping skills current and identifying the
correct new technologies for the business
to implement is viewed as a big challenge,
and the depth of technical skills came up in
a number of guises.
One comment on a specific need
mentioned ‘automation and orchestration
experience skills’, and delineated that
comment with this rider: ‘true experience:
not just app UI, connectivity experience,
device experience, app experience and data
experience.’
A progressive business need that IT can
assist greatly with was identified by one
commenter: ‘expertise in certain languages
such as Brazilian Portuguese, Arabic and
Hebrew’ in their digital products.
The comment from one respondent on
‘technically competent management’ leads
nicely into our next section.
Biggest IT skills gaps: people
The people issues surrounding IT skills are
broad ranging. Let’s start with a refreshingly
honest assessment of those at the top:
‘CEO’s in large enterprises are idiots - they
no (sic) nothing about the processes the IT
department uses for risk acceptance and
design - and are more interested in shiny
things or things that make them look good.
‘They need to think about what they are
doing and stop poncing about pretending to
know what they are doing when it is clear
that they have never worked in IT for real,
ever.’
Some comments on this problem were
more circumspect (helpful?) – summed up
well with the need for ‘understanding of
the business and bridging the gap with the
organisation’s vision,’ and, ‘understanding
of the business objectives driving IT
choices.’
The gap in knowledge is more than just
in management for some. One person said
businesses could do without ‘outdated
staff looking back at when they could
operate in “god mode” and dictate to
everyone what they were given - and now
moaning that the world has moved on. IT
departments themselves are in danger of
becoming the biggest blocker on effective
organisational modernisation.’
One comment mentioned the need for:
‘quality managers who understand much
more than simple economics - people
are the key resource to be managed and
encouraged, not beaten into submission.’
Other places where soft and hard skills
may overlap, and mentioned specifically
by commenters as problem areas
included simple experience: commercial
knowledge and experience; the need
for more and better project managers;
implementing agile methods; and fostering
an entrepreneurial culture.
A further wrinkle on the ‘hybrid
managers’ idea involved several
commenters pointing to the importance of
IT people being properly involved in selling
the organisational strategy and getting it
over the line and into delivery.
Some commented on the implications
of outsourcing, asking for, in one case:
‘In-house staff (project coordinators) to
be better capable of querying delivery of
solutions provided by the private sector.’
This respondent notes that skills are lost
to outsourcing and with them an ability to
respond to new technologies.
Specific types of organisation face
particular sets of issues. A public sector
digital leader lamented: ‘As a public sector
organisation it is difficult to recruit suitably
qualified staff due to the limitations on
salary (nationally agreed salary scales).
‘Even training our own staff will not
fully address the issues. This is further
complicated by the organisation seeing
efficiencies of technology adoption without
the necessary investment in the backend
staff to make these efficiencies a reality.’
SMEs faced something a little different,
one person citing ‘silos of knowledge - not
specifically because information sharing
is poor but because the organisation is
small and we have specialists in individual
areas.’
Some larger organisations face related
situations, with some commenting on the
breadth of skills now needed in IT people,
making them look for people who can
play multiple roles in a team and have
knowledge of different technologies to
support business partners.
Here’s a laudable, if tough, goal.
‘We need future proofing abilities - the
provide the requisite infrastructure and
applications.’
This organisation still has needs though:
‘Having a good system architect and IT
Churn is an issue, with some organisations
finding that the capabilities of new recruits are
insufficient to slot into the existing workforce.
service design, to ensure we’re as digitally
savvy as our citizens’
Then there are the problems in actually
finding those with skills in emerging trends
– by definition a small recruitment pool.
This creates a tension when trying to be
innovative, with one digital leader saying
that the organisation suffers from ‘too
much time spent “keeping the lights on”
and not enough time spent innovating.’
organisation is running to stand still at the
moment.’
Churn is an issue, with some
organisations finding that the capabilities
of new recruits are insufficient to slot
into the existing workforce when an
organisation loses existing skilled
personnel. One organisation’s loss is
another’s gain in this scenario, of course,
with those leaving sometimes doing so to
go onto better careers, a by-product of a
competitive workforce market.
One commenter gave an interesting
solution to some of these issues: ‘Our
organisation operates on a lean staffing
model, with skills augmented from
external service providers. The model
envisages the system landscape and
architecture to be designed in-house
and bringing in external suppliers to
The next three to five years
Looking three to five years’ ahead, the
same three main issues are expected to be
at the top of the list of priorities.
However, the order is slightly different,
with strategy and planning coming out top
(46%), followed by operational efficiencies
(44%), and business transformation and
organisational change (42%).
When asked which IT topics will be the
top three priorities in three to five years’
time, information security is again the top
answer with 54 per cent.
This is followed by big data (42%), cloud
computing (40%) and mobile computing
(39%). The information security concern is
the top answer for both SMEs and large
companies.
There are a number of issues which
security experts are the key gaps that
the organisation faces. Supplier (service)
management is a close second.’
Biggest IT skills gaps: hybrid
As implied by the last section sometimes
the skills gaps are not in the organisation
itself but in their suppliers.
Several commenters pointed to the
issues surrounding assessing skills within
their outsourced partners. One respondent
complained that, ‘IT capability is being
delivered by an outsource agent who puts
in sub-standard agents without the skill
sets necessary. This causes major delays
and they incur penalties.’
Another ‘people issue’ is around an
increasingly aware consumer base. A
public sector digital leader said that they
need people with ‘digital awareness in
2015 DIGITAL LEADERS
17
Image: iStockPhoto
BCS, THE CHARTERED INSTITUTE FOR IT
are expected to become a higher priority
in this time frame - the two showing the
highest percentage increase (compared
with plans for the next 12 months) are
succession planning (up from 9% to 17%)
and performance management (up from
16% to 23%).
Compared with the priorities for the next
12 months the IT topic showing the biggest
rise is internet of things (up from 11% to
28%). 3D printing goes up only one per cent
in this context.
Other areas mentioned as being of
medium-term concern were supply chain
integration; embedded wearable security;
other wearables; and predictive analytics.
For SMEs cloud computing is expected
to be the second priority (42%), whereas for
larger companies big data was anticipated
to be the second priority (47%).
People issues seem to be in the category
of ‘we’ll get there eventually’ with the IT
skills shortage, performance management
and recruitment and retention issues
all scoring higher in this future-looking
question than in immediate priorities.
More forward-looking were some of
the technical concerns that may impact
the business. Mentioned specifically
were concerns over the complexity of virtual environments; polyglot systems maintenance; and migration to a post-JavaScript web.
Sleepless nights
The final question BCS posed in this survey
was: When considering upcoming changes
and trends in the IT industry, what is it that
is most likely to keep you awake at night?
Again, the answers were a mix of the
expected, and some thoughtful ideas on the
longer term.
As ever, security is at the top... and
came in a variety of guises: information
risk and security; availability; security and
stability in cloud computing; and security breaches.
There were a lot of mentions of topic-specific security issues around, for
example, the internet of things; smart
solutions; the compliance agenda, and
general reputational risk. Zero-day exploits
and the ever-changing nature of security
threats were also mentioned.
Some issues were perceptual, for example (all commenters' views):

• The build-up of technical debt by making wrong product selections;
• The illusions of remote access: that everything is an app that appears to have no cost;
• The speed of competitor change, with the risk of products becoming outdated;
• The possible effects of the revised data protection legislation in 2017;
• The corrosive effect of hype and nonsense in IT;
• Slow change in large organisations, ticking boxes around superficial initiatives to comfort senior management;
• Cloud hosting being seen as the panacea for everything by the business without necessarily thinking things through – hence also the integration of different cloud platforms;
• The change in computers caused by the exascale revolution;
• Technically incompetent management;
• The burden of legacy;
• Responding to change in a large enterprise environment. Agile works at small scale, but it is difficult to scale without it turning back into compressed waterfall.
Financially based concerns included
(again, from comments):
• Unannounced changes to suppliers'/customers' systems;
• Provision of high-speed broadband to
customers for free;
• Cost of software licensing - Microsoft
vs open source.
Positive notes to end on
So, there’s plenty for the digital leader to
do, consider, worry about, look forward
to… and clearly many of them, whilst
concerned about the risks, take an
admirably positive view, one that takes
people rather than just cold technology
into account.
One commenter's chief concern? Being
‘people-centric: developing systems that
the public want to use.’
Another points to the importance of the
shift of capability from traditional IT roles
to a more distributed user-oriented model,
whilst maintaining control and governance
over the enterprise architecture.
Here are some final answers to: ‘When
considering upcoming changes and trends
in the IT industry, what is it that is most
likely to keep you awake at night?’
• 'Nothing - that's why I have a CIO!'
• 'Not a lot - that is what claret is for.'
• 'I'm confident that we are up to the challenge.'
Some have identified bigger issues. One commenter laments 'the complete lack of human beings involved in recruitment - applicant tracking systems have reduced the quality of the recruitment process to almost useless.'
And there are ever-present dichotomies, for example, the tension between the need for computer security and privacy protection and the public expectation of easy and quick access. If this tension leads to shortcuts, that will cause problems.
One commenter warns about the
government’s desire to integrate and
adopt collaborative data sharing without
full investigation into the consequences in
terms of resources and IT security.
Alarms are also raised at the risks
associated with cloud computing and the
US Government’s stance on data stored on
infrastructure belonging to US companies
- with the associated data protection
nightmare.
One commenter warns of a ‘collapse
of trust in online systems because of
over-complexity and lack of attention to
resilience and security.’
BUSINESS ANALYSIS
ANALYSIS AS A SERVICE
Analytics-as-a-service (AaaS) can provide on-demand access to an organisation’s predictive and business
intelligence operations. James P. Howard, II MBCS and Scott Beaumont introduce the concepts of AaaS and
provide an architectural overview of this powerful technology.
Analytics-as-a-service deploys predictive models directly to the enterprise as first-class
services on a service-oriented architecture.
Making these models first-class objects
gives a number of advantages.
First, they are decoupled from their
data sources and access data through
abstraction layers. Second, they are
decoupled from their callers and give
the callers an abstract interface for
analytic processing requests. Finally, the
decoupling continues to receivers where
the recommendations of the analytics
service are presented through the
abstraction layer.
These decouplings and abstraction
layers are supported by asynchronous
communications through the use of
message queues while the access
abstraction layer is created by the
enterprise service bus (ESB). Pushing
abstraction enables the entire enterprise
to move into cloud-based analytics, taking
analytics and, importantly, decision support
off of individual desktop computers.
Customer-facing systems with access
to high-quality analytics at runtime can
replace critical triage functions with decision
support services through cloud-based
solutions. These solutions will give faster
response times with better accuracy when
analytics are provided as a service. Further,
by focusing on analytics-as-a-service and
decoupling access paradigms, all parts of
the enterprise can access the system.
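To make this decoupling concrete, here is a minimal sketch. The queue, the message format and the `score` model are all illustrative assumptions, not part of any specific product: Python's standard `queue` module stands in for an enterprise message broker or ESB, which a real deployment would use instead.

```python
import json
import queue
import threading

# Stand-in for an enterprise message bus: callers never see the model directly.
requests = queue.Queue()
responses = queue.Queue()

def score(payload: dict) -> float:
    """Illustrative predictive model: a trivial weighted sum."""
    return 2.0 * payload.get("recency", 0) + 0.5 * payload.get("spend", 0)

def analytics_service():
    """Consumes abstract analytic requests; callers know only the message format."""
    while True:
        msg = requests.get()
        if msg is None:          # shutdown sentinel
            break
        req = json.loads(msg)
        responses.put(json.dumps({"id": req["id"], "score": score(req["data"])}))

worker = threading.Thread(target=analytics_service)
worker.start()

# A caller publishes a request without knowing where or how it is processed.
requests.put(json.dumps({"id": 1, "data": {"recency": 10, "spend": 100}}))
result = json.loads(responses.get())
requests.put(None)               # stop the worker
worker.join()
print(result)                    # {'id': 1, 'score': 70.0}
```

Because the caller only sees a request and response format, the model behind the queue can be retrained, replaced or moved into the cloud without changing any caller.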
Architecture
The architecture of a data analytics
platform is here broken down into four
basic components: data storage,
extraction – transformation – load (ETL)
services, analytics services and real-time
analytics services. Other components,
such as ad hoc analytic tools, monitoring
and reporting services, and end-user UI
access should be included, but are left out
here for brevity.
However, these additional components
would depend upon the four core components
described in the rest of this section.
The data store(s) used for holding the
to-be analysed data can range anywhere
from a traditional relational database
management system to a non-relational,
distributed NoSQL database engine.
More complex scenarios can even utilise
a combination of storage engines. The
type of storage engine(s) selected should
depend upon the type and amount of
data that needs to be analysed. Many
enterprises are dealing with the current
data explosion by moving to distributed
NoSQL database systems and using the
cloud to scale their analytics storage and
processing.
The ETL services are responsible for
fetching (and receiving) data, transforming
it to a useful model, and ensuring that it is
loaded into storage. ETL services can be
deployed to automatically fetch data at
regular intervals, depending on computing
costs, bandwidth, service level agreements
and so on. This is the traditional approach
to ETL.
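The interval-driven approach can be sketched as a simple fetch-transform-load loop. The source records, the transform rule and the in-memory 'warehouse' below are hypothetical stand-ins, not part of any particular platform:

```python
import time

def fetch():
    """Hypothetical source system: returns raw records."""
    return [{"amount": "12.50"}, {"amount": "3.10"}]

def transform(raw):
    """Shape raw records into a useful model (here: parse the amounts)."""
    return [{"amount": float(r["amount"])} for r in raw]

def load(rows, store):
    """Ensure the transformed rows end up in analytics storage."""
    store.extend(rows)

def run_etl(store, cycles=3, interval=0.01):
    """Fetch, transform and load on a fixed schedule.

    The interval is kept tiny for the demo; a real deployment would choose
    it based on computing costs, bandwidth and service level agreements.
    """
    for _ in range(cycles):
        load(transform(fetch()), store)
        time.sleep(interval)

warehouse = []
run_etl(warehouse)
print(len(warehouse))   # 6 - two records per cycle over three cycles
```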
However, in a service-oriented
architecture, it is also possible to set
up ETL services that receive data from
publishers, especially internal business
services that produce critical data during
processing. This can provide enterprises with real-time analytical capabilities. This is where data analytics is required to be a first-class citizen in the enterprise.
Business services need to be implemented knowing what, when, where and how: what data needs to be sent for analysis; at what points during processing it should be sent; where to send the data; whether to send it by ESB, queue or direct service call; and how to format it for the service.
Some type of asynchronous approach, whether a full-fledged ESB or a lightweight message queue, is usually the best option. An asynchronous approach will not tie up the business process while it communicates with the ETL service. An ESB or message queue can also provide reliability, ensuring that the ETL service eventually receives the message, even if it was unreachable during execution of the business process.
The analytics services are the consumer gateway to the data analytics platform. Some services might provide access to the underlying data stored within the platform. Other services are responsible for performing specific analyses and returning the results. Still other types of services might be responsible for scheduling and storing results for future reports. In an SOA, these services can be composed together or used separately by consumers to produce meaningful results.
A distinction needs to be made between long-running analytics services and those that provide real-time results. Currently, in-depth processing of huge data sets takes some amount of time, anywhere from seconds to minutes to hours, and maybe even days. It depends on the depth and amount of data to be processed, the algorithms used for processing, available bandwidth and so on. In many cases, this is acceptable.
However, there is a growing need for in-process analytics; that is, analytics that can provide an almost immediate response during some business process. This might allow an enterprise to detect identity fraud or misuse of company resources while, or even before, it happens, and not after the fact.
Services that provide these capabilities
are identified as real-time analytics
services, but the classification of what
qualifies as ‘real-time’ is really up to the
enterprise.
A service that takes a second or two to
provide a result would be acceptable for
many enterprises, while other enterprises
can allow for more or require less time.
These ‘real-time’ services might meet
their time constraints by using just a
subset of available data or by pre-fetching
and remodelling data for its specific uses.
The point of the ‘real-time’ service is to
provide quick value to some business
process. In the previous example, a full
analysis to detect identity fraud or misuse
of resources should still be conducted
regularly with all available data.
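One way such a 'real-time' service might meet its time constraint is to score incoming values against a summary pre-computed by the long-running batch job, rather than scanning all available data on every call. The data set and the two-standard-deviations rule below are illustrative assumptions only:

```python
import statistics

# Full historical data (in practice this would live in the analytics store).
history = [102.0, 99.5, 101.2, 250.0, 100.8, 98.9, 101.5, 99.9]

# Pre-fetched summary, refreshed periodically by a long-running batch job.
baseline = {
    "mean": statistics.mean(history),
    "stdev": statistics.stdev(history),
}

def realtime_check(value, baseline, threshold=2.0):
    """Cheap in-process check: flag values far from the pre-computed baseline."""
    z = abs(value - baseline["mean"]) / baseline["stdev"]
    return z > threshold

# Called synchronously inside a business process - answers immediately.
print(realtime_check(260.0, baseline))   # True: looks anomalous
print(realtime_check(101.0, baseline))   # False
```

The quick check provides immediate value to the business process, while the full, slower analysis over all data is still run regularly in the background.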
A key way to improve the quality of
decision-support models is to close the
feedback loop. Closing the loop captures
the key features of the predicted case,
along with its outcome, to revise the model.
This can be done in real-time or with
a bulk update, but the important part is
tracking a known outcome. Outcomes, both
positive and negative, serve to strengthen
the predictive quality of the model.
As the model is revised, its predictive
power improves and increases the model’s
efficiency and accuracy. The enterprise’s
decision-support systems improve,
providing better results to the business. As
the predictions improve, so do the results
and the circle is reinforcing. Further,
while the accuracy is improving, so is the
precision, and the narrower predictions
support the enterprise’s business
decisions.
These sorts of self-updating models are
used by technology companies to improve
search results while retailers’ recommended
products use these sorts of models to
connect shoppers to products they are likely
to purchase. These models are revenue
drivers when properly deployed.
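Closing the loop can be sketched with a deliberately simple model that revises its prediction as known outcomes are recorded. The `FeedbackModel` class and its running-rate 'model' are illustrative assumptions, not any particular product's implementation:

```python
class FeedbackModel:
    """Toy decision-support model revised as known outcomes arrive.

    Predicts the probability of a positive outcome as the running rate of
    positive outcomes observed so far (a deliberately simple 'model').
    """

    def __init__(self, prior=0.5):
        self.positives = 0
        self.total = 0
        self.prior = prior

    def predict(self):
        if self.total == 0:
            return self.prior
        return self.positives / self.total

    def record_outcome(self, positive: bool):
        """Close the loop: feed the known outcome back into the model."""
        self.total += 1
        self.positives += int(positive)

model = FeedbackModel()
print(model.predict())            # 0.5 - the prior, before any outcomes

# Outcomes can arrive in real time or via a bulk update; both revise the model:
for outcome in [True, True, False, True]:
    model.record_outcome(outcome)
print(model.predict())            # 0.75 - revised after four known outcomes
```

The important part is the same as in the text: every prediction is eventually paired with a tracked outcome, and that pairing is what drives the revision.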
Analytics-as-a-service can improve the implementation of enterprise-level reporting, decision support and intelligence.
There are a variety of tools available to
analytics architects to improve deployment,
increase capabilities or meet business
constraints. Implementation is only limited
by the requirements of the service and the
ingenuity of its implementers.
BUSINESS ANALYSIS
COMPLEXITY CHALLENGE

A different approach is required to avoid falling into the comfortable trap of trying to get away with simply 'keeping the wheels turning' or delaying issues in the hope that they will go away. Complexity needs to be understood and controlled.
Making effective use of IT always brings complex challenges and changes to organisations, but now there
are not only growing numbers of options available from technology, but also increasing commercial
demands and constraints on the business. Ensuring that IT explicitly meets business needs has never been
more important. Martin Sharp MBCS explains.
IT changes are always challenging, but
increasing complexity means that
companies face greater difficulty in
ensuring that IT delivers the promised
business benefits. Companies and
individuals also struggle to demonstrate
the direct linkages between IT investment
and results.
This linkage is not only highly desirable
for the organisation, it is also important for
the individuals concerned; their value to
their employer will typically be measured
by their proven impact on successful
outcomes, not by effort on work in
progress.
The secret is to understand what drives successful initiatives; this can be helped by considering the following points.
Change imperative
It is often said that the only constant is
change, and projects involving the business
use of IT feel this most keenly.
Not only is there the incessant
undercurrent of shifts and advances in
the technology itself, but there are also
internal and external pressures causing
business and process change across the
organisation. Companies alter their focus,
build partnerships, merge and spin off
new structures in the face of changing
economic circumstances, new legislation
and competitive threats.
In IT projects, things should not be
allowed to ‘just happen’, or be delayed or
be overtaken by events. Technology thrives
on change, but even those most closely
involved in technology and innovation
fear it. This is not the right way to achieve
success. Changes must be embraced,
their impact clearly evaluated and driven
through in order to align, meet and deliver
against business goals.
The key imperative for any organisation
faced by change, whether derived from
internal plans and strategy or imposed
by external events, is to ensure that the
process moves rapidly through from
resistance to positive action. This requires
effective leadership and capable project
management.
Project leadership is not the same as
organisational leadership or the process
for managing individuals through their
working career. Although there are
many overlapping skills, effective project
leadership is often most extensively
best employed in the earliest phases of
projects, with an involvement that can
diminish once traditional operational
management take over. For this reason,
most organisations would struggle to fully
utilise this type of skillset on a permanent
basis and thus find external support both
welcome and cost-effective.
Complexity challenge
There are many items adding to the stress
levels of IT projects, and most of them
have little to do with the technology itself.
Far more pressing are the constraints
on skills and resources, and the difficulty
in articulating and linking the IT challenge
to the requirements of the business.
These, coupled with any internal politics,
can often undermine the control necessary
to drive projects through to successful
completion.
Strategic drive
To avoid being overcome by the fear of
driving through change or the typical
stresses facing IT projects in all
organisations, a clear path needs to be
set out. This is not always easy for most
organisations to accomplish when
everyone is caught up in the day-to-day
operational requirements.
Many are put off by the perception that
setting out a clear strategy will involve too
much extra effort or will somehow tie them
down unnecessarily, and so they continue
in an ad hoc manner making reactive
tactical decisions as and when required.
In many areas of IT, the constant
innovation and change are also often cited
as reasons for not wanting to be pinned
down to a particular approach.
This might seem as if effort and resources are being saved or protected, but the reality is that this approach will only lead to bigger issues over time.
Only with a well thought out strategic approach and the right architecture in place will IT retain both the flexibility and the structure required to demonstrate how the strategy meets business aspirations and fits operational processes, all within the constraints of often limited IT resources.
To avoid the pitfalls of IT being labelled 'costly and unnecessary', the links between the strategic imperatives for the business and the realities of project delivery have to be clearly established. IT needs a strategic architecture that integrates business needs and processes with technology requirements and deliverables.

Structured delivery
For IT projects it is no longer sufficient to measure success based on 'in time, in budget' (as if they were all achieving this in any event). There has to be a direct and credible connection between investment and results. This requires an approach that combines a number of elements within frameworks that can be employed and adapted to suit different types of projects.
There are many rigorous methods and methodologies available, but slavish adherence to any one is not likely to be sufficiently all-embracing or flexible to lead to success. Project delivery needs to be focused, well structured and carefully managed through an experienced blend of well-proven best practices from business and IT methodologies.

IT benefits are often misunderstood by those in control of the direction of the business or managing finances. They are also often badly represented by those who are too close to the problem.
Benefit clarity
Many people talk about benefits when they
are really referring to improvements or
advantages, but these are not things that
have clear and tangible business value.
Two primary axes can be used for
outlining benefits, which can be categorised
into ‘financial versus operational’ and
‘increasing versus decreasing’. Essentially
something can increase value add or
flexibility, or it can reduce cost or risk.
Results need to have recognisable and
demonstrable benefits for the organisation,
its competitive position and the individuals
involved. Critically the impact of investment
in technology, systems and architectures
must be explicitly linked to the actual
benefits delivered. This traceability ensures
that the actual and complete return on
investment and contributions of those
involved is well understood.
Delivering real benefits
IT projects can often appear to be managed
successfully to completion and yet no one
is really happy.
Work has been done, resources deployed
and budgets met, but without the firm
connections of accomplishment to specific
benefits for the business, the value will not
be recognised, and in many cases probably
has not really been properly delivered.
It is no wonder that those who make
financial commitments and decisions often
deride IT projects as either too expensive
or unnecessary.
It does not need to be this way
Organisations can deliver changes today
that provide real business benefits, but this
process must be driven by business need
and with full understanding of the elements
involved, by both IT and the business.
By taking a systematic and strategic
approach, clear, understandable and
relevant benefits can be delivered to the
business as it stands today, or by what can
be achieved for tomorrow as:
• process effectiveness;
• operational efficiency;
• automation efficiency;
• opportunity creation.
IT and business projects can deliver
immediate and relevant benefits, but only
if changes are driven, the organisation is
internally correctly aligned and a strategic,
clear and well-understood architecture
is in place. The imperative for IT is to
embrace this approach and work with
those who know how to make it a reality.
BUSINESS ANALYSIS
SHARKS AND ICE CREAM
David Geere MBCS ponders the nature of doing business in the modern world and asks: is your business
intelligent?
If you look at the biggest companies in the
world, they have very different data
characteristics. Some are based on
massive data processing and analysis
capabilities (online search and advertising being the most obvious).
But consider an upmarket hotel chain.
Having great data is one thing, but no-one
on reception to check you in, a lumpy pillow
and an empty mini-bar are business killers,
irrespective of having a real-time report,
globally mapping every un-made bed.
Many major corporations were built
up from a single store. Their founders
spent their time on products, people and
customers. Their business lives depended
on getting it right, fine-tuning and selling
the old-fashioned way; then repeating. But
that was then, and this is now.
So how can businesses today, with the
opportunity to record and analyse every
data point in real-time, support but not
lose sight of their real mission, namely, to
serve, delight and retain the customer in a
profitable way?
Management information (MI), business
intelligence (BI), enterprise performance
management (EPM), insight, analytics,
data science, enterprise data management
(EDM), big data - call it what you will - these are all names for collecting, storing
and analysing information.
Key to this kind of analysis is cross-comparing something with something
else. You could cross-compare one
internal system against another: did
sales change because all my staff went
on holiday? Or you could cross-compare
one system against external data such as:
did all my staff go on holiday because the
weather was really great?
Both analyses would be considered
backward looking. Looking back can help
to understand why things occurred, and
hopefully to stop bad things next time (or
allow the good things to flourish).
However, with some thought, experience
and possibly help from a computer you
can forecast the future. Selling more when
it rains (umbrella anyone?) or struggling to
get products to stores due to busy roads
(bank holiday weekend) should influence
how the business is run. Forward-looking
analysis can be tricky, the further into the
future you go, the harder it can be.
Be careful which variables you link
together to make a decision. Fact: the
more ice creams sold, the more shark
attacks there will be. So to stop shark
attacks, simply stop selling ice creams? Of
course not; the reason for both ice creams
and sharks is probably down to hotter
weather.
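The ice-cream/shark trap is easy to reproduce in a few lines. In the sketch below, a hypothetical temperature series drives both ice cream sales and shark attacks, so the two correlate strongly even though neither causes the other (all figures are invented for illustration):

```python
import statistics

def correlation(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    mx, my = statistics.mean(xs), statistics.mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Hypothetical weekly temperatures drive both series - the hidden confounder.
temperature   = [12, 15, 18, 22, 26, 29, 31, 28, 23, 17]
ice_creams    = [t * 40 + 100 for t in temperature]   # sales rise with heat
shark_attacks = [t // 10 for t in temperature]        # more swimmers in heat

print(round(correlation(ice_creams, shark_attacks), 2))
# Strongly positive - yet banning ice cream would not deter a single shark.
```

Drop the temperature column from the data set and the ice-cream/shark link looks causal; keep it, and the confounder is obvious. That is the case for thinking before letting the computer decide.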
It’s a brave CEO that lets computers
start making data-driven decisions. It
might be the norm for high-frequency
share trading, but do we really want to
let a computer change prices, change the
in-store lighting or order 10 tons of beans?
There are obvious challenges. We need good data so that decisions are based on accurate, timely and complete information. Every company has the usual data quality challenges, master data challenges, and a lack of training and governance to some degree.
The best companies, however, spend time on these challenges and make it part of the culture to do so. Data quality will never be perfect, but going the extra mile to help people to get it right (and ensure there are fewer opportunities to get it wrong) typically bears fruit.

Greater transparency?
Employees in general are very important, whether your organisation is a heavily data-driven organisation or not. Giving away the company silver by making information too widely available is bad news, but making sure people know what is happening is important.
If they don't know the results of their efforts, they can't know if they're doing it better than before. All businesses should listen to the people on the front line; it's they who are sometimes the voice of the customer. Some customers want to stay anonymous. Let them.
The price of data-related technology has gradually reduced. Disk space and memory have never been cheaper. So it's now feasible to load in and analyse pretty much everything, from structured database files and market research to unstructured video clips, social media and handwritten contracts.
In the old days you had to take a subset and make assumptions that it was representative of the whole. Those days are gone. Now you can find the niche, make things localised, tailored and specific to an individual customer.
Depending on your point of view,
standards and knowledge have gone up,
but data loss is more widely reported and
your data is in a system connected to the
internet (through firewalls if you’re smart,
but so are the hackers). The cloud may be
safer, faster and cheaper than your own
data centre, but do the sums add up? Think
carefully; decisions can be hard to reverse.
And how about gut instinct, using a
pair of eyes, and just listening? Many
mechanisms for running a business are
still the same as they’ve always been; give
the customer what they want, in a better
way than everyone else, and develop your
people (profitably).
Customers are becoming more
demanding, savvy and price conscious.
The best businesses know their customers
and what they want and need. The story
is relevant for public and third sectors
as well; these organisations still have
customers, who still expect good service;
it’s just paid for in a different way.
The explosion of social media means
your customers are telling the world about
your business, where before it was just
their friends and colleagues. But this kind
of interaction can also make it easier to
find out what they want. You don’t need a
big expensive computer to crunch all this
data, but it can make a big difference if
done well.
Most online businesses have grown up
recording things that traditional firms will
take years to catch up with. For some it
will be a struggle to capture everything
needed to get the right answer. Deciding
what to measure rather than what data is
available is key - the top-down approach.
If data is not available, innovate. Use proxy
data or collect and track in a different way.
Ensure there is a way to get analysis into
the hands of decision-makers, otherwise
what’s the point?
In the best companies, business
intelligence is part of the culture. In
companies that have intelligence,
employees know how their piece of the
jigsaw is doing, what the customer likes
and doesn’t like, why things are what
they are and what they can do to improve
things. They know quickly when something
is going well or badly and they act. Even if
it’s a computer that told them and makes
the change.
BUSINESS ANALYSIS
ENABLING SUCCESS
Reports of ‘failed’ or ‘challenged’ information systems (IS) projects are all too common. We have adopted
initiatives such as outsourcing, decentralisation and centralisation to try to improve things. We have seen
the introduction of new roles such as enterprise architects and new methods such as Scrum. And yet,
the failures just seem to keep coming - only with ever-increasing price tags. With this in mind, Debra Paul FBCS asks: what are the problems we really need to address?
On 18 September 2013 the Financial Times
summed up the situation as follows:
‘There are several ways the US Air Force
could have wasted $1.1bn. It could have
poured tomato ketchup into 250m gallons
of jet fuel or bought a sizeable stake in
Bear Stearns. Instead it upgraded its IT
systems.’
Over the last few decades, those of
us working in the information systems
industry have tried various approaches
and means to improve the chance
of delivering successful IS projects.
These have ranged from adopting
formal analysis and design methods to
increasing the level of collaboration with
customers. Studies looking at the causes
of problems often focus on issues related
to governance, lack of user involvement
and poor requirements. While governance
is concerned with what Drucker called
‘doing things right’, ensuring that we are
‘doing the right things’ is also extremely
important. Requirements are not poor
per se, but they are often subjected to
limited analysis or are documented poorly.
Equally, they may not be elicited with care,
not relate to the business needs or fail to
address the issue to be resolved. However,
in some situations, it is not possible to
address the real issues because there has
not been any attempt to determine what
these issues actually are. Answering the
question: ‘what problem are we trying to
address here?’ is critical for a successful
IS project, but somehow all too often this
is missed.
Requirements are defined in order
to ensure that the business operates
efficiently. They are related to business
objectives and critical success factors
(CSFs). They are a response to a
recognised and understood ‘root cause’ of
a problem. And, to really understand why
requirements are not correct or are poorly
described, we need to adopt a business
analysis approach, investigating the root
causes of poor requirements so that we
can overcome them and ensure business
requirements are understood before we
leap into adopting a ‘solution’.
Too often, change programmes are
initiated because someone (typically at an
executive level) has a ‘great idea’. Perhaps
they have had a discussion about a new
software package with an acquaintance
who works for a competitor or partner
organisation, or they have heard about a
terrific new technological development,
or have picked up a book extolling the
virtues of a new business model. But this
does not mean that the idea will work
in every organisation and, even more
critically, it focuses on the solution before
understanding the problem to be resolved.
If we do not understand the problem
to be addressed or the opportunity to
be grasped, we have a fatal flaw in
the proposition. In other words, if the
justification for the project is solely that
someone else has tried it, then it is almost
certain to fail. Similarly, if an idea is not
evaluated for its potential impact, then
the opportunity exists for highly disruptive
– perhaps destructive – results. At best,
the resultant change programme will be
‘challenging’, at worst, we will be looking
at failure.
In our private lives, would we embark
on a significant purchase without (a)
understanding why and (b) thinking
through the outcomes we would want?
I would hope most of us would answer
with a resounding ‘no’. And yet we hear of
projects that are set up with the vaguest
of intentions. I once asked a business
manager what he would like to achieve
through a proposed project. His answer
was ‘increased profit’ and no amount
of further discussion could gain any
clarification. He had seen a competitor
use a similar solution so wanted it in his
company irrespective of how it would
meet any business objectives, help achieve
defined CSFs or impact the existing
people, processes and systems.
Business analysis is utilised differently
in different organisations. Many
organisations have recognised the value
that business analysts bring and have
engaged in the ongoing development
of the BA practice, thus ensuring that
projects address real needs. However,
some organisations do not appreciate the
value of business analysis and, even if a BA
practice exists, it is a ‘hidden gem’, offering
significant capability that is under-utilised.

Figure 1. The T-shaped BA professional: the horizontal bar covers business knowledge, interpersonal skills and generic IT knowledge; the vertical bar covers business analysis skills and knowledge.
As a specialist discipline, business
analysis ensures that organisations
don’t jump to hasty solutions and don’t
waste the vast amounts of funds spent
on the all-too-often publicised project
failures. The key factor here, though, is
the word ‘specialist’. Business analysis is
conducted by professionals with a toolkit
of techniques and competence in a range
of skills. They understand the business
domain and the stakeholders with
whom they need to engage and
communicate, and to whom they provide
support. But these skills are held by many
IS professionals, so what makes business
analysts the right people to investigate
ideas and problems, and identify and
help evaluate the potential solutions? The
answer may be seen in the ‘T-shaped
professional’ model for business analysts
shown in figure 1. The horizontal bar in
figure 1 shows the generic business and
IT skills required of IS professionals and
the vertical bar sets out the ‘deep’ areas
of specialist BA competence, which here
relate to key business analysis skills such
as problem investigation, situation analysis,
requirements engineering and solution
evaluation.
Specialist business analysts require this
skill set in order to be able to conduct the
range of activities included in business
analysis work. These activities are shown
in the BA process model in figure 2, which
sets out clearly the responsibilities of a
business analyst and the strategic context
within which they are constrained and
performed.
So, if business analysts are ‘T-shaped
professionals’, with the skills and
knowledge required to conduct the
activities shown in the BA process model,
why is it that, if the analytical work is
conducted, it is often undertaken by other
professionals (with similarly extensive
skills, but in different ‘deep’ areas)? This is
a recipe for hasty judgements on solutions,
a failure to consider tacit knowledge, a
tendency to accept assumptions and a lack
of focus on root causes. And, if a change
project begins with this inadequate level of
analysis, it is not too hyperbolic to say it is
doomed.
Ultimately, we all want the same
thing – to ensure our organisations work
effectively by deploying the best systems
and processes to support the work
performed by skilled people. To achieve
this, we have to think before acting, which
means we investigate and analyse before
committing funds to unwise investments.
Business analysis is a hidden gem in too
many organisations. It is time to raise its
profile so that IS projects have a better
chance of delivering business success.
Figure 2: The Business Analysis Process Model. Within the business context, the stages are: investigate situation, consider perspective, analyse needs, evaluate options and define requirements.
2015 DIGITAL LEADERS
27
COMMUNICATIONS
WHAT IS COLLABORATION?
Philippa Hale MBCS, Director and Senior Consultant at Open Limits, asks: what is collaboration, what stops
it and how can we make it happen?
Collaboration across diverse teams and
between levels of the hierarchy remain the
twin unconquered peaks for many
organisations. It is often the fatal flaw
uncovered in corporate disasters, and the
silent, relentless drain on cash,
productivity and talent of less public but far
more numerous organisational
restructures and transformations.
The following shows how teams from
three very different organisations identified
and overcame barriers to collaboration. In
one case the teams were specialists within
the same large IT function, responsible for
different steps in the service delivery process
managed in different countries.
The other teams were from different
functions including: finance, legal, sales,
marketing, HR and IT.
At its simplest collaboration can be
defined as: ‘To work with another person
or group to achieve something’. Initially the
teams thought of collaboration in terms of:
• the tools: the technology and media
for accessing and sharing documents
and applications, tracking progress,
gathering data for decision-making,
and following processes.
• the location: in some cases the teams
worked remotely, across sites,
countries and continents. In others
they were on different floors of the
same building.
However, all agreed that the real heart of
collaboration was not just working
alongside each other to deliver products
and services; there was a creative, more
intimate, proactive element and more
in-depth on-going knowledge sharing,
learning and debate.
Stories of good collaboration included
doing interesting, challenging work,
discovering a whole new side to people,
making a difference and being recognised
for it. Poor collaboration led to deep
frustrations and anger over what were seen
as avoidable blocks by individuals, teams
and management.
Where these had been left unchecked, the
stronger emotions had dulled to cynicism,
small barbs of passive-aggressive behaviour
such as not turning up to meetings or going
against decisions made, indifference to new
initiatives and doing the minimum.
What stops collaboration happening?
Human beings, it seems from looking at
any news media on any given day, are
socially and psychologically programmed
to stick to and to defend their own.
Collaboration is also a natural human
behaviour, but one that requires a degree
of maturity, awareness of self and others,
positive perseverance in the face of others’
reluctance and an environment where it is
safe to explore the new and unfamiliar.
Goffee & Jones’ Why Should Anyone
Be Led By You? (2006) and Kotter’s
Accelerate (2014) discuss the
in-built systemic loops that discourage
collaboration. It takes a resilient individual
or team to question their own and others’
habits, behaviours and thinking.
The danger when senior management
talk about collaboration is that they refer
to best practice principles and thinking,
which, while they make perfect sense, do
not connect with the day-to-day experience
of team members and managers ‘on the
ground’. In each of these three examples,
senior management encouraged teams to
first get some perspective, then address the
details that mattered to them.
Proceeding sensitively was found to be
important, as there were clearly areas of
rawness around attitudes and perceptions,
behaviour and performance. The teams
included speakers of English as a first
and second (or third…) language from all
continents.
Three barriers identified by the groups,
and solutions explored
The teams identified three barriers, and
possible solutions to overcoming them. The
highlighted barriers were the top priorities
because:
a) everyone could take action and benefit
immediately;
b) improving these areas would enable
more in-depth collaboration in other areas.
Barrier 1 – Emails
The phenomenon of email ‘flaming’ is
commonly recognised. When stepping
back and analysing the specific language
in their emails, the groups were quite
shocked by the tone of some emails. Both
managers and team members commented
that they had become desensitised.
Comments included: ‘It’s not nice, but they
are always like this so we try not to let it
get to us’.
Given that emails were the only actual
words exchanged between some teams on a
regular basis, this was critical. The language
ranged from the unclear, incomplete and
unhelpful, to the frankly abusive. Plus, there
was limited understanding of the damage
that a frustrated ‘cc’ escalation could
cause, particularly in cultures with more
hierarchical, high ‘power distance’ (Hofstede)
relationships.

Solutions explored
The groups initially blamed factors outside
their control. These included frustrations
around (perceived or real) poor planning
and prioritisation passed down the
hierarchy, skills gaps, bottlenecks,
misaligned processes and senior
managers using unhelpful language
themselves. However, when their focus
shifted to what practical steps were
possible, the groups started to feel more
positive and more willing to take on some
responsibility for finding solutions. These
included, for example: asking for a
meeting, picking up the phone and asking
questions.
After discussing the seven areas of waste
identified in ‘Lean’ process reviews, one
team identified ‘waiting’ for action from
those interdependent teams as an area to
work on. By using the ‘neutral’ vocabulary
of lean thinking, they could name their
concerns and offer practical suggestions
more comfortably.

Barrier 2 – International English
There were many examples in all the
groups of language-based
misunderstandings, where second/third
language English speakers had done their
best to articulate their needs, and the
native speakers, perhaps having never
experienced working in a second
language, took the words used at face
value. Some examples of ‘false friends’
included the use of ‘you should …’, which
sounds like a command to a British reader
but from a German translates as ‘May I
suggest that you …’ – so the intention was
to be collaborative!

Solutions explored
Discussing these language aspects together
was extremely helpful in relationship building.
All parties were keen to learn how they
were perceived and what they could do to
help understanding. For native speakers,
slowing down – considerably – was key, as
was not using local expressions. Other things
included keeping sentences short and not
using any waffle or ambiguous management
jargon. Plain English actually sounds more
professional and authentic, but many native
and non-native speakers believe otherwise.
Groups created their own ‘meetings
from hell’ checklist as a light-hearted way
to highlight better practices for face-to-face
meetings and video/audio conferencing and
shared this with their bosses.
Barrier 3 – Prejudice
Having never met face-to-face in some
cases, and with nothing but a few words in
emails and general media images to inform
their judgements, the teams had created
surprisingly detailed pictures of the
intentions, level of intelligence, technical
competence, work ethic and values of the
other groups, or even of people in the same
department.
Suggestions explored
One team invited the other party to work
with them on highlighting and addressing
issues together, one at a time. ‘The whole
solution in the room’ was a phrase used.
Another turned process mapping into
a shared, physical and visual activity, with
giant post-its, a wall and marker pens.
This filled many gaps in understanding
and increased appreciation of each other’s
knowledge, context and constraints.
In one team, where intercultural training
was not an option, members volunteered
to research one of the countries they were
working with and present their findings.
This included contacting their local native
speaker colleagues and asking for their input.
They found this fun, fascinating and a great
icebreaker.
Results?
The changes in mood, attitudes and
behaviour in each of the teams were
quicker and more significant than expected.
Within three months, there were multiple
examples of small improvements in
collaboration with significant impacts in
delivery.
Actively spending time reviewing
successes and small improvements
reinforced the shared sense of
achievement. Six months on, internal and
customer relationships and delivery had
improved in all cases.
Importantly, in all three cases, a senior
manager got involved, either at the start or
when asked to support the initiatives being
taken.
Collaboration breeds more collaboration!
MOBILE:
MORE THAN APPS

Richard Phillips MBCS, a senior technical architect, discusses mobile applications in an enterprise environment.

In the dying days of the last millennium the
buzz in boardrooms of large companies
was around achieving a web presence and
not getting left behind in the dot com boom.
With web presence now mostly established
in all but the most technophobic of
companies, the chatter is mostly around
the mobile space.
Just like the early days of the web, the
mobile world is a volatile environment and
establishing a mobile presence is not as
easy as many would have you believe. For
the hobbyist developing apps in their spare
time, failure just means that you move on to
the next idea, but for a large enterprise
failure could mean spiralling costs, brand
damage or security breaches.
For many companies mobile means
just one thing: ‘We need an app!’ Apps are
certainly one solution but they are by no
means the only one and, in many cases,
not even the best solution.
The first port of call for anyone
developing enterprise applications for
the desktop is the browser and the same
should be the case for mobile.
With the support for HTML5 in most
modern mobile browsers you have the
ability to develop rich internet applications
that can also interact with many of the
capabilities of a mobile device.
If, for example, you want to use the
device’s GPS sensor to find the current
location, use the compass to determine
which direction you are facing and then
use the camera to capture an image, this
can all be done from a browser. As HTML5
evolves we will see support for more
capabilities; for example, support for the
near field communication sensor is just
around the corner, with other capabilities
likely to follow.
Even if the thing you want to do cannot
be done in a mobile browser, the first
step in your mobile journey should still
start with the browser. There are many
companies who have mature mobile
offerings in the app space, but accessing
their website on a mobile device takes you
to the desktop-optimised site rather than
one optimised for mobile.
As an end user I might get a slightly
better user experience from the app, but
as a company I might have invested a lot
of money and resources in an app that
could have been more cheaply and easily
delivered in a browser. Alongside this,
the first mobile experience a user is likely
to have with a company is through the
browser; if this is poor then how likely is it
that the user will download the company’s
app?
In order to get the mobile browser
experience to an acceptable standard,
consideration will have to be given to
responsive web design. In its simplest
form this takes the information received
from the browser and rearranges the
content to match the screen it is displayed
on. It is, however, seldom good enough
to simply change the way content is
displayed. Consideration has to be given
to how the content is interacted with. The
mobile site needs to be optimised for
touch instead of click.
The starting point for this is to make
elements big enough to be touched, but
further to this you need to ensure that
touch events are triggered appropriately
to avoid the delay associated with click
events. Often making the same site work
equally well on mobile and desktop is a
step too far and the easier option is to
route mobiles to a separate site where you
can give a truly touch-optimised mobile
experience.
When considering creating an app, the
lessons learned from the mobile web user
experience and the data collected will
contribute to its success. A good mobile-optimised
website can be used to get end
user feedback on features they would
like to see and usage analytics on what
devices they are using.
This brings us to the main question
that should be asked: what do people
want? An obvious use case of mobile is
to enable people to interact whilst on the
move, so it is important to concentrate
on the aspects of your business that lend
themselves to a mobile customer.
For example, as a banking customer it is
likely that I may want to quickly check my
current account balance or transfer money
from my savings account on the move, but
probably less likely that I want to apply
for a new bank account. Alternatively, as a
banking employee I may want to be able
to start a bank account application with a
customer on my mobile device in the foyer
of the branch and complete it later in the
back office.
When you know what you and the end
user want the app to do, the next problem
is choosing the platforms that you want to
support. Data from usage analytics of the
mobile web will give a pointer to which
operating systems and devices are most
popular. It is likely that the app will need to
be developed for at least two platforms.
When you develop a website that needs
to support more than one browser it is
likely that there may be tweaks here and
there in the code, but it is usually nothing
very significant.
With an app the story is quite a different
one, as it has to be developed separately
for each operating system. Recent
innovations in developing cross-platform
frameworks have meant that it is now
possible to share code between apps,
but for some types of app code must be
written specifically for each platform.
Maintaining multiple versions for
different operating systems also has a
testing overhead. This is then exacerbated
further by the number of devices and the
situations in which these devices are used.
Testing a web app usually means sitting
behind a desk running the app on maybe
three or four browsers on a corporate
test network. Testing a mobile app should
be done in the real-world on real mobile
networks on tens of different devices.
For many businesses a mobile app
is the first time they actually distribute a
piece of software to the general public.
Fortunately the various app stores have
made the distribution bit relatively easy,
but there are several other aspects
that need to be considered. With a web
page most of the true code is protected
on application servers, safe inside the
corporate network; an app however has the
code on the device itself. In the worst case
scenario code is interpreted at run time so
held in plain text, but even compiled code
can be de-compiled and converted to plain
text. This obviously exposes a vulnerability
to the corporate network. Obfuscating the
code can go some way towards protecting it,
but can never protect it 100 per cent.
Other issues can arise from not being
able to control whether a customer
upgrades to the new version of the app.
A company can always guarantee that,
if someone is accessing a website, they
are accessing the latest, hopefully bug-free
version of the site; this is not the case with
an app where the customer could be using
an older version of the app containing bugs
or even security vulnerabilities.
It is therefore essential that apps are
given versions and any access back to the
corporate network has the ability to allow
or disallow versions of the app. There
are many more security vulnerabilities
associated with apps, which would merit a
textbook on their own.
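One sketch of such a version gate (the version format and cut-off here are assumptions for illustration, not details from the article): the corporate gateway compares the version an app reports against a minimum supported version and refuses anything older.

```python
def parse_version(version):
    """Turn a dotted version string such as '2.3.1' into a comparable tuple."""
    return tuple(int(part) for part in version.split("."))


def is_allowed(reported_version, minimum_supported):
    """Allow a request only if the app is at or above the minimum version."""
    return parse_version(reported_version) >= parse_version(minimum_supported)


# If older releases contain a known security flaw, raising the minimum
# supported version at the gateway locks those clients out until they upgrade.
MINIMUM_SUPPORTED = "2.1.0"
print(is_allowed("2.3.1", MINIMUM_SUPPORTED))  # True  - up-to-date client
print(is_allowed("1.9.8", MINIMUM_SUPPORTED))  # False - outdated, possibly vulnerable
```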
If nothing else this brief article has
hopefully put across the point that going
mobile is not just about apps and that
going down the app route is not always an
easy option. Another point to remember is
that unlike some other technologies, the
devices and operating systems customers
are using today will be obsolete in maybe
as little as two years. So a project to
deliver mobile functionality today cannot
be thought of as one that ends in maybe
three to six months; development of a good
mobile application never ends, rather it
continuously evolves just as the devices it
runs on do.
COMMUNICATION PITFALLS
Communication is a cornerstone of a great strategy; done well it will deliver the desired message and reduce
any anxiety that may exist; done poorly and resistance to change will increase and performance may be affected.
Justin York MBCS highlights some pitfalls, problems and solutions all too common in communication.
Corporate communication brings a number
of problems, and it is necessary to
understand the type of issues that you may
come across:
Poor response: this refers to a lacklustre
response or lack of engagement with a
process (usually of change).
Resistance: the actions of people
responding to the change do not reflect
acceptance, regardless of the initial
communication response.
Rumours: employees, when faced with
significant change, often start rumours
about what they think is going to happen;
the so-called ‘reading between the lines’!
Open hostility: hostile behaviour
affecting the teams or organisation needs
to be addressed quickly and effectively.
Such open hostility is bad for morale,
motivation and the business as a whole.
Reduced performance: over time a
reduction in performance from a team
member or whole team is noticed.
The severity of the above issues will
vary between organisations, how adaptive
the people are to change and how much
change is going on at any one time.
Communication preferences
People have their own preference for
receiving information and processing it;
they typically fall into one of three
preferences aligned with our senses:
Visual: around 60 per cent of the
population are visually-orientated people
and like pictures or images. They like to
see drawings, and colleagues drawing concepts
or images that they can relate to.
Auditory: approximately 20 per cent of
the population prefer auditory stimulation
and messages, generally preferring their
communication to be through the spoken
and written word.
Kinaesthetic: the remaining 20 per
cent of the population prefer the thinking
/ feeling approach to understanding
communication and processing
information. They are the hands-on people
of an organisation and would very much
prefer a walkthrough of a concept as
opposed to pictures and words.
Preference   | Likes                                   | Dislikes                              | Actions
Visual       | Images; drawings being done at the time | Bullet after bullet; continuous words | Will be looking at the screen or image being drawn
Auditory     | Words/language                          | Images/drawings                       | Will be looking from screen to speaker
Kinaesthetic | Images/words associated to feelings     | Just images or words                  | Will be looking from screen to speaker, then neutral
When thinking about preferences of
the audience, it is essential to mix the
language of presentations and briefings
so that everyone feels that they have been
included in the process. The delivery of
presentations should therefore include
words and images, which can also
induce a feeling of connectedness for the
kinaesthetic in the audience.
This table is a generalist view of the
preferences; remember there will be a
blend of preference as it is highly unusual
to be 100 per cent one type.
Communication strategy
Communication needs to be clear and to
the point, styled in a way that suits everyone;
however you do need to ensure that you
can pass the same message to all levels
with subtly different techniques. It is
essential that you work out who the
communication is for and what they expect
from it.
Once you have decided what your approach
will be, then you can produce the material that
will back the strategy. This may contain
PowerPoint presentations, verbal briefings,
written messages such as letters or emails
and poster-type material. By making the
delivery of the strategy as diverse as
possible, the maximum number of people
can be engaged.
The key aspect of this development
of a communication that encompasses
all is not to lose the thread of the overall
message and to make sure it remains the
same.
A good communication strategy has to
contain:
• What is it you’re trying to communicate
and how will you do it?
• Who are the audience (CEO, the
board, senior managers, teams or
individuals)?
• What is the purpose of the
communication (new strategy, merger,
redundancy etc.)?
• Who will be communicating the
message, and at what level?
• What are you trying to achieve with
the message?
• What response are you looking for?

Delivery
Delivery of communication must be
congruent; body language, tone and pitch
and the words all need to be in alignment.
Research suggests that 55 per cent of the
message is carried by body language, 38
per cent by the tone and pitch of delivery
and seven per cent by the actual words.
The audience will believe incongruent
communication less. This may lead to
reading between the lines and an attitude
of resistance. Imagine a presenter who is
delivering some important communication
about change, but their posture is poor and
their tone and pitch are monotone even
when they are making a positive statement.
That scenario does not deliver a congruent
message to the audience. Conversely,
imagine a presenter who has great posture,
is animated, and whose tone and pitch are
variable as they deliver a positive message;
basically, the greater the congruency the
less the chance of misunderstanding.
Benefits
All individuals will undergo an emotional
response when confronted with change
and often how this change is
communicated effects how it is accepted or
rejected. The emotional responses felt by
individuals are real and require some
managing, particularly if the change will
affect their role. Addressing concerns
ahead of a more formal corporate
communication will help to reduce the
impact.
The 3E’s model asserts that individuals
will have an expectation about what
they are being told and they will have an
experience (good or bad) that will result in
an emotional response. If the expectation is
met through experience, then the emotion
will be positive and the expectations will
be reinforced; if the expectation is not met
as a result of experience then the emotion
will be negative and expectations will be
‘dashed’. The key assertion is that the
communicators have the power to meet
or deny expectation (assuming they know
the expectations) through the delivered
experience.
Reducing resistance to change to
an absolute minimum will reduce
management overheads allowing more
time to drive the changes. It is essential
that the message is further reinforced,
through objectives and task assignment.
This article has discussed some of
the fundamental issues that surround
communication of strategy in organisations.
These can be summarised as:
• Know who it is you are communicating
to and produce the appropriate level of
material.
• Mix the delivery medium (visual, auditory
and kinaesthetic) to engage as many
of the people as possible.
• Make your strategy clear and concise
to reduce the amount of resistance
that people will have to the change.
• Show what’s in it for them to engage
them further and gain their support.
• Deliver your communication in a way
that minimises reading between the
lines; if you feel you can’t say
something then don’t.
• You cannot ‘not communicate’; body
language makes up 55 per cent of
what we say; when you’re
communicating make sure your body
language, tone and pitch and words are
congruent.
• Deliver your message like you mean it
and believe in it.
CLOUD COMPUTING AND DATA MANAGEMENT

QUALITY DATA

It is estimated that by 2020 there
will be nearly as many digital bits
in the world as there are stars
in the universe, some 44 trillion
gigabytes!1 Therefore, is it any
wonder that C-level executives
are hungry to harness the power
of business intelligence (BI) to
make sense of their companies’
data and gain a competitive
advantage? Lee Warren makes the
case for good data visualisation.

‘Data! Data! Data! I can’t make bricks
without clay!’ These words, which Sir
Arthur Conan Doyle (author of the
Sherlock Holmes books) gave to his
famous detective, are truer today than
ever; even at its most granular level
data is binary, objective and concrete; data
don’t offer an opinion or come in different
shades of the truth.
However, actually delivering information
is only effective and will only deliver true
unequivocal insight if it’s easily grasped
by its audience without ambiguity. In my
experience, too often the emphasis is
placed on data discovery, resulting in the
correct method of visualising the data being
neglected.
In practice, and as is well known, decision
makers are deluged with information and
squeezed for time. It’s also well known
that what those decision makers need is
at-a-glance data to inform those decisions.
However, what isn’t as well known or
understood is how a poorly presented
piece of key information can completely
undermine its currency and impact.
Examples could be poor choice of
colours, inconsistent sizing or misalignment
of graphics, all of which affect accessibility
and make a dashboard something people
are less inclined to look at. Additionally,
whilst it is hoped that consideration is given
to choosing the right chart for each metric,
this is a practice often overlooked.
Knowing to display time-series values in
a line chart (the majority of the time – there
are some exceptions!), why 3D charts can
lose some of the accuracy of how data is
presented, and when to choose a bullet
chart over a spine chart requires expertise
and experience.
Data should be presented alongside an
intimate understanding of what decision
makers need to know. Whilst dashboards
are often developed collaboratively
and informed by a systematic and
comprehensive requirements gathering
exercise, the main focus is on what is
needed to be delivered and less on the
how. Put simply, in a sea of important
metrics presented in a dashboard, how
does the key single message reach out at
its audience?
The impact of misrepresented visuals
on decision making and the ripple effect
through a business is considerable.
To effectively harness the power of BI
solutions and empower individuals to effect
change, the following interrelated concepts
should be adopted across the business:

Investment – financial and cultural
Appropriate funding should be invested to
ensure the right presentation software is
made available. Furthermore, time should
be granted in the development phase to
ensure industry-standard and best-practice
presentation methods are applied.
Decision makers within the business
should support the adoption of BI and
do what they can to encourage top-down
buy-in to maximise the chances of
successful integration.
Change how a dashboard evolves
Best-practice BI feeds on input from all
levels throughout an enterprise. Effective
visualisation requires going one step
further: deeper collaboration to prototype
and workshop visualisation methods to
iteratively ask questions such as ‘what do
you see when you look at this chart?’ This
approach is less about defined metrics and
more about the semantic understanding,
which, though likely to differ across
individuals, must be factored into any front-end development.
Recognition of the importance of BI
across the enterprise
Investing in industry-standard expertise is
vital, as is empowering those individuals to
deliver a solution that has real impact on
the business by supporting the decision-making process.
Embed the solution
Incorporating well designed charting to
sit on quality data is an important step.
However, it’s essential that time is given to
train users on the advantages and
limitations of what a dashboard presents
them with, to lessen the chances of
misinterpretation. Dashboard development
should not be viewed as a single activity
with a clear end point. Instead, it should be
something that is developed collaboratively
and iteratively, engaging with end-users
and adapting its look and feel as necessary
to evolve the solution, in a controlled
manner.
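The iterative, question-driven approach described above still needs a way to decide which message leads the dashboard. As a minimal sketch – the metric names, targets and selection rule here are all hypothetical – one option is to rank each metric by its deviation from target and headline the worst offender:

```python
# Hypothetical metrics: each has an actual value and an agreed target.
metrics = {
    "revenue":    {"actual": 1.02e6, "target": 1.00e6},
    "churn_rate": {"actual": 0.071,  "target": 0.050},
    "nps":        {"actual": 41,     "target": 40},
}

def deviation(m):
    """Relative distance from target; the sign shows direction."""
    return (m["actual"] - m["target"]) / m["target"]

# The metric furthest from its target becomes the single key message.
headline = max(metrics, key=lambda name: abs(deviation(metrics[name])))
print(headline)  # churn_rate
```

Here churn is 42 per cent over target, so it – not the modest revenue beat – is the message the front-end should make impossible to miss.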
A BI solution can only be successful if
sufficient time, care, expertise and investment
are given to how data are presented back to the
end-user. Neglecting this core area introduces
the risk of ambiguity and threatens the adoption
of BI, undermining the pivotal concept of
informed decision making, which itself is a key
driver in developing any business.
References
1. EMC Digital Universe report, with
Research & Analysis by IDC, April 2014
2015 DIGITAL LEADERS
35
CLOUD COMPUTING AND DATA MANAGEMENT
FUTURE DIFFERENTIATOR
The challenge for CIOs is to embed technology into day-to-day operations in a way that is both intuitive and provides competitive advantage. As technology continues to pervade everyday life and the 'internet of things' peeps over the horizon, Bob Mellor MBCS explains how data centres – once simply the engine houses of the organisation – are set to become the differentiator between companies that are merely capable and those that excel.
CIOs are faced with a major dilemma: their board colleagues expect them to continue to cut costs, and at the same time expect them to inject step-change innovation into the heart of the operation.
How is this possible? And which technology should they bet on to address those needs? Mobility? BYOD? Teleworking? Data analytics? Software-defined networking? Data mining? Cloud? Virtualisation? All of the above? None of the above? It's all a little too much.
Such a broad collection of advancing technologies makes the job infinitely more difficult than just being in charge of the company tin.
What is clear, though, is that all the advancing technologies have a couple of key – actually critical – common characteristics: the storage of masses of information, rapid access to it, and coherent assembly of it, in a way that enables people to do their job, and do it better.
Data centres, centre-stage
Industry analysts tell us that, as a society, we have published as much information since 2003 as we did in the whole previous century. This trend is set to continue its exponential growth for the foreseeable future. It's simply a fact that, with the advent of digital records, information previously held in book or paper form no longer decays: the restoration of old BBC reels, for example, into MP3 format means that they will never deteriorate beyond their current state.
Internet traffic is growing at a similar rate. According to Cisco projections(i), by 2018 traffic will reach 64 times the volume of the entire internet in 2005: the data equivalent of all movies ever made will cross the network every three minutes.
The one common theme that should be evident to CIOs, in the clamour of emerging technologies, explosive end-user growth and the urgent, pressing demands of their boardroom peers, is the data centre. It takes a mere matter of minutes to draw up a traceability diagram showing that the intersection of all these technologies and requirements is right there, on the data centre floor.
The problem is that data centres are often a jumble of project-delivery technology that has grown in fits and starts with small budgets, urgent demand and – often – little real executive sponsorship. Suddenly, from a timeline going back to the late 80s, CIOs find themselves the custodians of a veritable garage full of technological whatnot: most of it just works, some of it they know nothing about, and much of it is seen as not coming out particularly well in a cost/benefit analysis. Right now, the CIO who fails to get a handle on their data centre content in a precise, coherent fashion is feeling the pain – and will continue to until they invest some effort in getting on top of it.
How CIOs 'do' their data centre – whether they own and manage their own, co-locate with a data centre specialist, or put everything they have into the cloud (or, in fact, a hybrid of all three) – will finally become a point of differentiation as we continue through the decade. It is of little consequence which approach is taken, provided it meets all the criteria required by the business; of greater consequence is how they do it: what strategy they come up with, and which direction of travel they set.
Which data centre technologies?
The key to unlocking much of the emerging technology lies in ensuring the correct technologies and capacity at the back end. If your end-users can collate, format, store, retrieve, analyse and present the data you have on your data centre floor in a rapid, coherent way on their tablet or notebook, you're halfway to making the right investment choices, simply because all the other technologies will require this. So why not start there?
The 'trinity', as far as data centres are concerned, is: fast, agile networks; servers with adequate processing capability; and tiered storage devices utilising advanced technologies. But first, CIOs must get a proper understanding of the content of their data centres. Without this, understanding how to landscape their 'trinity' will be, at best, difficult, and probably impossible.
‘Software defined’ is an overused
buzzword to describe a perspective over a
data centre where all resources – storage,
processor, memory, network – are simply
nodes over a fabric that can be dynamically
controlled and orchestrated from a console
or consoles, or set to respond automatically
and dynamically to specific conditions.
Once simply a pipedream, today we stand
at the dawn of such a utility delivery in
reality, with many key vendors stepping up
to the challenge and offering solutions.
Take-up, as yet, is not prolific, but many
organisations indicate their intention to
explore, or actually invest in the technology
over the next 12-18 months. This has the
potential to be a game-changer, providing
resources in a utility model that can bring
massive benefit in terms of cost, rapid
deployment, capacity management and
benefits realisation.
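The 'respond automatically and dynamically to specific conditions' idea can be made concrete with a toy controller. This is an illustrative sketch only – the tiers, thresholds and rebalancing rule are invented, and no vendor's API is implied:

```python
# Two hypothetical resource tiers on a software-defined fabric.
pool = {
    "web":   {"vms": 2, "cpu_load": 0.92},
    "batch": {"vms": 6, "cpu_load": 0.30},
}

def rebalance(pool, high=0.85):
    """Shift capacity from the least-loaded tier to any tier over threshold."""
    hot = [name for name, r in pool.items() if r["cpu_load"] > high]
    cold = min(pool, key=lambda name: pool[name]["cpu_load"])
    for name in hot:
        if name != cold and pool[cold]["vms"] > 1:
            pool[cold]["vms"] -= 1   # reclaim a node from the idle tier...
            pool[name]["vms"] += 1   # ...and hand it to the busy one
    return pool

rebalance(pool)
print(pool["web"]["vms"], pool["batch"]["vms"])  # 3 5
```

In a real software-defined estate the same logic runs against an orchestration layer rather than a dictionary, but the principle – resources as nodes, policy as code – is the same.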
Virtualisation is a similarly over-used
buzzword, though its definition should be
well understood, as the technology can,
arguably, be described as mature.
The technology allows hardware to be configured efficiently and optimally, so that processing requirements can be matched accurately, and in a timely manner, with the server estate
on the data centre floor, without the need
for huge footprint expansion. Organisations
have already seen massive benefit from
virtualisation, and this technology will
continue to grow.
Where software-defined and
virtualisation converge is where the
data centre has the potential to be THE
differentiator for a business – a vehicle for
competitive advantage like no other.
CIOs who can get on top of their current data centre content in a coherent way, and position themselves for timely, accurate investment in software-defined technologies for data centre and network, coupled with appropriate investment in virtualisation, can find themselves in the
enviable position of having an estate that
meets – or enables - their boardroom
colleagues’ apparently contentious
requirements for embedded technology
to meet demand whilst still cutting costs.
Additionally, they’ll have a future-proofed
strategy for flexibility and capacity
management with maximum efficiency and
little waste.
For CIOs, the challenge is to get there as
rapidly, as risk-free and as inexpensively as
possible.
References
(i) USA Today: 'As Internet surges, data centers gear up', 25 October 2014
INFORMATION MANAGEMENT
In this information age we have yet to arrive at authentic information management. With this in mind, Dr C. James Bacon discusses information from a holistic point of view, namely the total information resource.
Information management (IM), as it's normally understood, is really about the management of information technology, or perhaps data management and software tools.
Similarly, the chief information officer (CIO) role isn't really about information either; it's about technology. Yet when the role was first created, following the 1977 report of the U.S. Commission on Federal Paperwork (chaired by Forest W. Horton and otherwise known as the Horton Report), it really was about the management of information as a strategic resource, rather than the technology management role it later morphed into.
What I want to look at here is a much wider understanding of information and a much broader concept of information management: what I'll call authentic information management (AIM).
Consider Pareto's 80/20 principle, which states that, for many events, roughly 80 per cent of the effects come from 20 per cent of the causes. It wouldn't be too surprising, then, if just 20 per cent of all information in organisations is actually useful. The rest is useless, or less than useful. If true, that's a huge waste of resources and a big drag on efficiency. Not only that, but the less-than-useful material blocks out the useful, and this has big implications for overall, systemic effectiveness – not to mention people effectiveness.
For example, back in 1955 the British department store chain Marks and Spencer (M&S) undertook a famous information reduction exercise called Operation Simplification, in response to rising overhead costs and falling profits. The well-documented end result was reported to have been an 80 per cent reduction in paperwork.
But the reduction in paperwork didn't just convert into cost savings. It was also reported at the time that there was evidence everywhere of a hidden treasure of ability and creativity that had been unlocked.
An authentic CIO
So how much effort, time and resource is spent on data management compared with information itself? The former is easier to get your head around because it's specific, it's tangible, and there are software tools for it.
Of course effective data management is vital, particularly for data quality, because it supports information reliability. But it may be that authentic information management (AIM) is the next frontier in making effective use not only of information and communications technology (ICT) in organisations, but also of information itself, and as a whole.
So how do you go about enabling AIM? The first thing might be the appointment of an authentic CIO, meaning one with overall responsibility for promoting the effective use of all information in the organisation as a strategic resource, with the present CIO renamed the chief systems officer (CSO), responsible for the overall management of information systems and business processes.
In some organisations there is now a separate CTO, responsible for the overall management of infrastructure. So what we might need in bigger organisations is: (1) an authentic CIO in a staff/advisory role to the CEO; (2) a chief systems officer (CSO); and (3) a CTO. In smaller organisations these three functions would, at least, need to be recognised in some way. And clearly, there would need to be close linkage and harmony between all three functions.
Secondly, there are two fundamental information questions that an authentic CIO might ask, and promulgate for everyone to ask, when it comes to identifying useful information in the organisation:
1. What do you do, or what might you do, with the information – in achieving what purpose?
2. What is the assessed or estimated net value (benefit less cost) of the information that you are getting, or giving, or might get or give – including your and/or other people's time-cost?
Implicitly, business analysts and/or management accountants have a key role to play here.
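The second question in particular lends itself to a simple scoring exercise. As a hypothetical sketch – the information flows and their scores are invented – each flow gets a net value of benefit less cost (with time-cost folded into cost), and anything at or below zero becomes a candidate for removal:

```python
# Hypothetical information flows, scored by an analyst on a 0-10 scale.
flows = [
    ("daily_sales_digest", {"benefit": 9, "cost": 2}),
    ("weekly_status_deck", {"benefit": 3, "cost": 6}),
    ("cc_all_thread",      {"benefit": 1, "cost": 5}),
]

def net_value(scores):
    """Question 2: benefit less cost, time-cost included in 'cost'."""
    return scores["benefit"] - scores["cost"]

keep = [name for name, s in flows if net_value(s) > 0]
cut  = [name for name, s in flows if net_value(s) <= 0]
print(keep)  # ['daily_sales_digest']
print(cut)   # ['weekly_status_deck', 'cc_all_thread']
```

The scoring is crude by design; the point is that once the question is asked systematically, the less-than-useful 80 per cent starts to become visible.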
A third thing that might be done is to
address that ‘big bogey’ of information
overload and its consequence of reduced
people effectiveness, namely email. But
a one-off exercise won’t do it. It needs to
be ongoing, and the whole culture of the
organisation might need to be moved in
changing email habits, which means a
need for (a) top-down governance and
leadership, and (b) bottom-up and lateral
collaboration and support. There isn’t
enough space to go into detail or justify
them, but here are some ideas to be going
on with:
•	Make it mandatory for (a) people to be email-free and text/phone-free when on annual vacation leave; (b) people to be email-free and text/phone-free with an 'Internet Sabbath' on Sundays; and (c) incoming messages during annual leave times to be met with an automatic response indicating: (1) another person suggested for contact if/as needed, and (2) that the incoming message will be automatically deleted.
•	Have trained email support advisors (ESAs) around the organisation, these being client-users who: (a) have good written and spoken language skills, (b) are good networkers, and (c) are well-trained in the features and functions of the email platform used.
•	Promote the concept of the processing burden, and the aim of minimising it (with the support of ESAs) by composing all emails to make them clear, concise, structured and relevant from the receiver's perspective.
•	Following the example of total quality management (TQM) in business processes, introduce and promote the concepts of info customer and info supplier, with the aim of giving and getting information quality and value in email exchanges.
•	Don't check for incoming email more than twice, or a maximum of three times, per day – but put your normal schedule and response policy below your signature.
•	Make your general goals/aims known; put them below your signature, and of course keep them reasonably current.
The essential aim here would be to use email as the thin end of the wedge in getting AIM off the ground and embedding it in the culture of the organisation.
SDSM: A re-think
Lastly, as for the systems development and support method (SDSM), there might be a need for a re-think because, judging by the burgeoning literature and reports on IT failure, the surveys indicating a lack of joined-up business and IT, and continuing anecdotal evidence, things are evidently still not going as well as they might be – although it needs to be said that it ain't easy to get it right!
Few people these days need to be told that IT and information systems (IS) are critical to an organisation's success. The flip side is that IS, and the way they're developed, can be deadweight for the organisation in terms of (a) fitting real needs, (b) the cost of these systems and their maintenance and support, and (c) their impact on information overload – and its potentially serious effect on people effectiveness and physical/mental health.
But what's behind all this? Well, we could start with the SDSM itself, the reason being that the SDSM and its people and organisation interactions can do a lot either to encourage or to negate authentic information management.
So here are a few more ideas to be going on with (and the use of words here is important):
•	Above all, get partnership collaboration between the IT/IS function and client-users.
•	Dispense with 'system requirements', and instead use information needs.
•	Apply the above two fundamental information questions in an audit of any and all legacy systems.
•	Study the work itself for any new or enhanced information needs, and encourage everyone in the skill of business activity modelling, in mapping and reducing the amount and complexity of work and its (superfluous) information 'requirements'.
•	Bring in and/or promote lean development, with its minimalist, waste-cutting, InfoLoad-reducing, net-value-oriented approach to systems development – but make sure that it is big-picture oriented and includes the external view.
•	Make InfoLoad avoidance the philosophy of the SDSM, and put information value, as the aim, right at the centre of the SDSM.
There is a lot more that might be said about recognising and achieving AIM, but these things will make for a good beginning.
MODELLING
MATTERS
Stephen Ball, Embarcadero Technologies, explains why, in the world of data management and ‘big data’,
data modelling still matters if you want to get ahead of your competitors.
Managing the volume, velocity and
variety of data in today’s enterprise requires
large teams, diverse skill sets and
complex workflows. Companies often
manage hundreds of applications and
thousands of databases, and information
inventories are growing much faster than IT
budgets.
Automation, efficient collaboration and
knowledge-sharing are critical to success.
The current enterprise landscape spans
a myriad of technologies, packaged and
custom applications, and on- and off-premise deployment. With hundreds of
applications and thousands of databases,
it can be very difficult to see the big picture
and find the relevant data. Enterprises
require the ability to manage large, diverse
information inventories in order to maximise
reuse, minimise redundancy and make the
best use of their information assets.
Of course, the ever-increasing volume
of data is one of the widely reported trends
in enterprise data that affect virtually
every organisation today. It’s not just the
growth in volume, but also the increase in
the pace of change and the variety of the
data coming out and contributing to the
complexity of the data landscape. There is
tremendous pressure on IT organisations
to manage and control this data.
More concerning is that the percentage of data that is effectively utilised seems
to be relatively small. This may cause
data to be sidelined into silos so it cannot
be discovered across the organisation,
affecting the ability to make good decisions.
Some organisations struggle with
quantifying the value of their data and
justifying the work necessary to manage
it properly. Enterprise data is no longer
strictly relational in nature. It has to be
integrated, rationalised and normalised
for the business to realise its value. There
are tremendous competitive advantages
available to those who can use their data
properly, but it requires that the data
be carefully designed, integrated and
accessible to the business teams who can
create the business advantage.
According to studies by analysts,
information is now being considered as
‘one of an organisation’s most critical and
strategic corporate assets.’ However, the
consequences of poor or inadequate data
modelling are still not widely recognised,
often because those consequences might
not reveal themselves immediately.
In a recent study1 of 530 senior executives from North America, Asia Pacific, Western Europe and Latin America, across a broad range of industries, a clear link was revealed between financial performance and use of data. A strong positive correlation demonstrates that data-driven organisations outperform those that do not have a handle on their data.
The study found that only 11 per cent
of respondents felt their organisations
made substantially better use of data
than their peers, even though more than
one-third of that group was comprised of
top-performing companies. In contrast,
of the 17 per cent of executives who said
their companies lagged behind their peers
in financial performance, none felt their
organisations made better use of data
than their peers.
Marketers now have access to
unprecedented insights about their
customers. Much of this is driven
by automated capture of behavioural
information – if you can see and
understand what customers are doing,
and how they are behaving in relation
to your products and services, you can
use that insight to tune your offers to
the marketplace. A lot of this is now
happening through mobile devices.
There are three widely reported trends
that impact every organisation today:
• A massive increase in enterprise data
coming from more sources, and being
refreshed faster. Both structured and
unstructured data are growing
exponentially.
• A reduction in the actual proportion of enterprise data being utilised by organisations (approximately 20 per cent in 2001; approximately five per cent today).
• With more data comes increased opportunity, but also a much increased risk of data miscomprehension and non-compliance with mandatory regulations.
These trends create serious and perennial
problems – problems that are growing
over time and not going away.
So why can't organisations make more effective use of information? In short, it's information obscurity: enterprise data isn't just big, it's complex. Enterprises have hundreds of systems and hundreds of thousands of data elements. If data is in a hundred different systems, if it's escaped the data centre on mobile devices or migrated to the cloud, where do you go to find the right data, and how do you interpret it? It's no surprise that most organisations can't leverage all of their data – in many cases, users can't even find it.
In many companies, IT departments are being challenged to do more with less – less funding, less headcount and less flexibility. As data volumes continue to explode, and instant, informed decision-making becomes a basic expectation, there is considerable exposure for those who need to make software selections, or to utilise that software in an effort to meet basic service levels. There are continually more apps running on more devices, connected with more databases. In order to work more efficiently and effectively, database professionals must use the right tools to manage their data. Cost and agility are recurring obstacles.
The challenge is to make the data truly usable. How does the data become valuable to the enterprise?
A professional data modelling tool has interactive capabilities that allow for the development of sophisticated, complex data models that can be fully utilised in a production environment. Tasks such as code generation and reporting functions are considered standard.
By adopting a data modelling tool, an organisation can not only enjoy the benefits of fast and user-friendly data model design and development; it can also more readily share, re-use and re-purpose data models, allowing it to improve data quality and compliance through standardisation and collaboration.
When done well, an enterprise data architecture is an organised model of a business's data and information assets, which contains a complete representation of the contents along with metadata that describes them. Data modelling is used to get a consistent, unified view of all of a company's data and to answer the following questions:
•	What are the data elements?
•	How are they defined?
•	Where are they stored?
An enterprise data architecture creates
value by managing data redundancy,
enabling organisations to integrate and
rationalise their data, and increasing data
quality by defining the relationships and
constraints around the data to ensure
that it’s correct at the time of capture. The
enterprise data architecture helps to make
everyone in the organisation aware of what
data is available and allows them to better
understand that data. Data models make
your data more accessible.
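Defining relationships and constraints 'to ensure that it's correct at the time of capture' can be shown in miniature. This is a minimal sketch with hypothetical tables, using SQLite; a professional modelling tool would generate equivalent DDL from the model:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("PRAGMA foreign_keys = ON")  # enforce relationships in SQLite
con.executescript("""
CREATE TABLE customer (
    id    INTEGER PRIMARY KEY,
    email TEXT NOT NULL UNIQUE              -- definition lives in the model
);
CREATE TABLE orders (
    id          INTEGER PRIMARY KEY,
    customer_id INTEGER NOT NULL REFERENCES customer(id),
    amount      NUMERIC NOT NULL CHECK (amount > 0)
);
""")
con.execute("INSERT INTO customer (id, email) VALUES (1, 'a@example.com')")
con.execute("INSERT INTO orders VALUES (1, 1, 9.99)")

# A row that violates the model is rejected at the time of capture.
try:
    con.execute("INSERT INTO orders VALUES (2, 99, 5)")  # no such customer
    rejected = False
except sqlite3.IntegrityError:
    rejected = True
print(rejected)  # True
```

The constraints are the model made executable: bad data is turned away at the door, rather than discovered downstream.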
Having a solid foundation with an
enterprise data architecture that includes
well-defined models can establish the
structure and process that is needed
to keep up with the volumes of data
that are coming in. Data modelling is
more important than ever before, and
organisations that seek to fully leverage
the value of their data assets must make
data modelling a priority.
Once your data architecture is established,
the next step to take with your data is
to share it across your organisation.
Collaboration and syndication of the data
across the organisation can give your
business analysts and decision-makers the
visibility they need. Modelling is a critical part
of the process to unlock the value of data.
Collaboration creates richer, more
usable metadata. Social engagement
and crowdsourcing collect corporate
knowledge and best practices, while
cross-functional collaboration captures
definitions, taxonomy, policies and
deployment information.
Syndication makes the data available
where and as needed. It allows you to
better manage your valuable data assets
and improves the usability of data and
metadata across the organisation. Once
you can find, know and protect your data,
you’ll be in a better position to automate
and document your deployments.
The benefits are numerous and include
establishing standards that ensure
consistency and improve quality for the
data. Even with the cost of investing in a
new product, you can demonstrate a return
on investment and operational cost savings
in a short time period.
Spreadsheets and workflow diagram
tools cannot scale efficiently to meet
rapidly evolving business needs. Having a
proper data modelling and management
strategy will result in better use and
reuse of your important data assets, and
therefore in better decision-making.
Reference
1. CIO.com: 'Data-Driven Companies Outperform Competitors Financially' – www.cio.com/article/730457/Data_Driven_Companies_Outperform_Competitors_Financially
ENTREPRENEURS AND STARTUPS
BUILDING YOUR BUSINESS
As an IT business, you’re probably technically very strong. But is this what your clients really value about you
above your competitors? Business consultant Richard Tubb MBCS takes a look at how to build the business
that your clients want to have.
Ask any owner of an IT solution provider
business what their top goals are and
invariably their answer will be to ‘grow my
business’. Attracting new clients and
retaining those clients are high priorities in
most business owners’ plans for growing
their business.
Interestingly though, when I ask IT
business owners how they will attract
and retain these clients, I often hear a
variety of answers ranging from offering
new products or services through to
implementing new marketing strategies.
While those ideas are important, I believe
businesses should be focusing on doing
something much more fundamental with
their business. They should be building the
business that their clients want to have.
What do clients actually want?
When I ran my own IT business - a managed service provider (MSP) serving small and medium-sized businesses (SMBs) - I
medium sized businesses (SMBs) - I
created the business because I was a very
good technician who was in demand.
As I built my business, I started noticing
that it wasn’t necessarily just my technical
knowledge that kept my clients coming
back to me. It was actually something
much simpler. It was a keen focus on
three key areas that delivered what my
clients wanted.
#1 Build strong client documentation
Take a moment to picture some of your
typical clients' offices and, specifically,
the owner’s desk within those offices. I’m
going to guess that some (and probably
many) of them are messy, with papers and
post-it notes scattered liberally across the
workspace.
Our clients face challenges in their
businesses that get in the way of them
doing what they really want to do. They’ll
absolutely value and appreciate somebody
who helps bring order to that chaos.
What does that mean in practical terms for you? It means building strong documentation about your clients' infrastructure. It means relying less on storing things in your head - or on your desk - and focusing on storing that knowledge in a structured way, so that you can easily call up the information at the drop of a hat, whenever your client needs it.
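As a sketch of what that structured storage might look like - the client, suppliers and values below are entirely hypothetical - the point is that facts live in a queryable record rather than in anyone's head:

```python
# Hypothetical client record: technical facts and 'non-IT' facts side by side.
docs = {
    "acme-ltd": {
        "network": {
            "ip_range": "192.168.10.0/24",
            "firewall_exceptions": [443, 8443],
        },
        "suppliers": {
            "air_con": {"vendor": "CoolCo", "warranty_until": "2016-06-30"},
        },
    }
}

def lookup(client, *path):
    """Walk the record so any engineer can answer 'who installed X?' at once."""
    node = docs[client]
    for key in path:
        node = node[key]
    return node

print(lookup("acme-ltd", "suppliers", "air_con", "vendor"))  # CoolCo
```

Whether the record lives in a dictionary, a wiki or a documentation platform matters far less than the discipline of keeping it current and structured.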
Let me give you an example, by way of a personal story, of how strong client documentation can help you become a hero with your clients.
My own IT business was borderline
obsessive when it came to documenting
clients’ businesses. Naturally we had all
the necessary technical information stored
- things like IP address ranges, firewall
port exceptions and the like - but we also
documented some of the areas of clients'
businesses that you might not necessarily
think of as ‘IT’.
For instance, one client telephoned us in
a panic because the air conditioning
unit in their server room was leaking
water everywhere. Electricity and water
do not make a good combination! In
their panicked state, the client couldn’t
remember who installed the air con
unit, let alone whether it was still under
warranty.
Thankfully, we did. We had the
information required and we made the
telephone call to the air con provider and
within an hour they had dispatched an
engineer to site to resolve the issue.
Who received the credit for that fix? While the client was very thankful to the air con provider, they expressed gratitude to us for making it happen. They valued the fact that they could rely on us to be organised on their behalf, and they were happy to pay for that privilege.
By focusing on strong client documentation, you make your business valuable. You're building the business your clients want to have - an organised one.
#2 Use checklists
Another way to build the business your clients want to have is to use checklists.
Those of us who fly a lot for business have confidence in the pilots of the aeroplanes we're travelling on. They've received extensive training, they have a ton of experience of flying, and yet before every flight they still run through a checklist to ensure everything is in order. Why is this? It's because checklists build consistency, and people value consistency.
We all have processes in our IT business that we need to repeatedly undertake. The next time you perform one of these tasks, I'd ask you to consider slowing down and documenting the steps you take. Create a checklist of the outcomes you might expect, so that you and your colleagues can easily do a 'sanity check' on your work at completion. These checklists will give you peace of mind that you're delivering work to a consistently high standard.
By focusing on strong checklists, you make your business valuable. You're building the business that your clients want to have - one that delivers consistently good service.
#3 Develop partnerships
The third and final key element in building the business your clients want to have is actively seeking out and developing partnerships.
Let me be blunt, based on my own
experiences in building an IT business. It’s
virtually impossible to cover all the areas
of IT that your clients will approach you for
help with. From a technical perspective,
it’s tough to be a company that does data
cabling, wireless networks, software
development, web design, marketing and
more.
Far too many IT businesses I see
try to be all things to all people - trying
to serve their clients' every need and
stretching themselves too thin. They end
up disappointing their clients by delivering
substandard service.
The good news is, you don’t need to be
a jack-of-all-trades. The real value lies in
knowing the right people to call when you
need help in a specific area.
You probably know a number of
companies who fit the bill already, whether
it be through peer groups, local networking
groups or companies you’ve used
yourselves. Reach out to these businesses.
Understand what opportunities they are
looking for. Build partnerships that will
help both of your respective businesses.
Then, keep your eyes and ears open.
Look for opportunities to introduce your
own clients to these partners.
Far from diluting your own value with your
clients, you’ll find your clients appreciate
you even more for having a strong network
of experts that you can call on to help them.
By focusing on building strong
partnerships with other businesses, you
make your own business valuable. You’re
building the business your clients want to
have - a well-connected one.
Bringing order from chaos
As an IT business owner, if your focus is on
growing your business - adding new clients
and retaining them - then consider three
key areas that your clients will truly find
value in.
Being organised through strong client
documentation is valuable to your clients
who are looking for help bringing order
from chaos.
By using checklists, your business can
deliver the high-quality, consistent service
that your clients will value, helping you retain them.
And by building strong partnerships with
other IT businesses who you can call on to
deliver high-quality speciality services to
your clients - rather than attempting to be
a jack-of-all-trades yourself - you’ll build a
well-connected company that your clients
won’t want to lose a relationship with.
Being technically strong as an IT
business is important, but by focusing on
the three key areas mentioned here and
by building the business that your clients
want to have, you can gain a competitive
advantage over those other technically
strong IT businesses who also want to
work with your clients.
2015 DIGITAL LEADERS
ENTREPRENEURS AND STARTUPS
INCUBATE/ACCELERATE

Image: VStock/128935762

There are a growing number of so-called business accelerators and incubators popping up all across the UK that offer budding entrepreneurs help and, more importantly, cash. However, is this all a bit too good to be true? Angel investor Stephanie Allen MBCS shares her own experience of this relatively new phenomenon.

Accelerators can be incredibly good for a startup, or incredibly bad. Far too often, not enough questions get asked about the appropriateness and benefits of participation - and, given the variable quality of the offering, it has become an important issue that I don’t think gets enough attention.

I am privileged to be a mentor to start-ups on a number of accelerator platforms - and I am genuinely enthused by my involvement and proud of our achievements.

However, not all accelerator programmes are equal. I have seen and declined many that are a lot less noble, and we need to raise awareness of the more self-serving operators who exist only to enrich themselves and can ultimately harm a participating start-up’s chance of success.

While UK plc has been very quick to promote the successes of the growing start-up scene in London, and speaks of it only in terms of pride, optimism and potential, we need to shine a light on the less scrupulous operators masquerading as enablers in the market.

There can be no doubt that there is a need for start-up assistance, and that there is a ready market for the money and the mentors it presents. So naturally, in response to the booming tech environment, there has been a proliferation of enterprises moving in, naming themselves accelerators or incubators.

However, only a few are good - and only very few also provide good terms to entrepreneurs. The remainder, frankly, operate as exploitative, self-serving vehicles that take much more out of a company than they put in.

I urge company founders exploring the option to look beyond the headline numbers offered and pay attention to the small print. Some may offer up to a six-figure cash injection, which, to a bootstrapping entrepreneur, may sound like a godsend… but, as well as the equity sacrifice you make for it, you may be asked to pay a fee (which can run into the thousands) to access it - of which a considerable element is often a repayable loan.

One well-known operator, for example, offers £100k upfront for 15 per cent equity. Now that’s a pretty chunky equity slice, but things don’t look really bad until you look at the details. Of the £100k outlay:

• £30k must be paid to access the loan, leaving just £70k in the kitty;
• of which £40k is a repayable loan, leaving £30k in the kitty.

Congratulations: you’ve just ‘won’ a place on an accelerator that costs you 15 per cent equity for a net cash injection of just £30k. You get my point.

Just as important as the cash element of an accelerator’s offering is the mentor network it can provide you with. This can be of equal if not greater value to your company. It’s very easy to get cash for good ideas these days, via the smorgasbord of crowd-funding platforms available. But you should be more discerning.

It’s not just money - it’s strategic money that you should be after. The differentiating feature, and one that can undoubtedly be worth the equity premium, is the network that accelerators can introduce you to: the brands, the industry contacts, the access to information and even the boring things like advice on valid grants and tax reclaims. The potential relationships and business education are worth measurably more than what faceless investor money can secure.

So I would recommend any entrepreneur to strip out the components of the offer, look at them carefully, and answer the following questions before making any decisions:

• Do I have to sacrifice equity? And if so, how much?
• Is any element a repayable loan? And if so, how much?
• Do I need to pay a fee to access it? And if so, how much?
• Who are the mentors? Are they appropriate for my product?

And look at it not just from an entrepreneur’s perspective, but also from an investor’s: would I still want to invest in you after what you have agreed to?

Do contact me with questions - find me at www.stephaniejaneallen.com

Some definitions
An accelerator is an organisation that offers advice and resources to help small businesses grow.
An incubator is a company that helps new and startup companies to develop by providing services such as management training or office space.
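The arithmetic above can be sketched in a few lines (the figures are the illustrative ones from this example, not any named operator's actual terms):

```python
# Illustrative accelerator deal arithmetic, using the hypothetical figures
# quoted in the example above.

def net_cash(offer, access_fee, repayable_loan):
    """Cash the startup actually keeps from a headline offer."""
    return offer - access_fee - repayable_loan

offer = 100_000        # headline cash injection
access_fee = 30_000    # fee paid to access the money
loan = 40_000          # element that must be repaid
equity_given = 0.15    # equity sacrificed

kept = net_cash(offer, access_fee, loan)
print(f"Net cash kept: £{kept:,}")                                    # £30,000
print(f"Effective price per 1% of equity: £{kept / (equity_given * 100):,.0f}")
```

Run against any offer's small print, the same sum quickly shows what a headline number is really worth.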
SMART SPECIALISATION

Image: iStock/475917063

Christopher Middup MBCS, an associate academic at the University of Derby (UDOL), and Alexander Borg, a consultant at the Malta Information Technology Agency (MITA), discuss the state of smart specialisation in the United Kingdom.

Smart specialisation is a new strategic approach to regional policy that has been adopted by the European Commission (EC) as part of a wider initiative to ensure that European Union states collectively achieve ‘smart, sustainable and inclusive growth’ by 2020 (European Union, 2013).

At first glance, the term may suggest that the goal of the approach is to achieve a very narrow, specialised regional focus, but this is not the case. Specialisation is the starting point - so there must already be some core shared knowledge and skill in a region to exploit. From that, the ‘smart’ development is further diversification around that core of specialism (McCann and Ortega-Argilés, 2013).

In order to achieve this, the EC has produced a framework for its member states and their regions to develop research and innovation strategies for smart specialisation (RIS3). The framework consists of six practical steps (European Commission, 2013):

1. analysing the innovation potential;
2. setting out the RIS3 process and governance;
3. developing a shared vision;
4. identifying the priorities;
5. defining an action plan with a coherent policy mix;
6. monitoring and evaluating.

Although the six steps together are well supported by the EC (through an ad hoc RIS3 support platform based in Seville, Spain, with processes, documentation and answers to FAQs), perhaps steps 1 and 3 are the key drivers that will determine whether a region can and will achieve smart specialisation.

Step 1 requires a regional champion with belief in the potential for regional development that yields widespread benefit. This feeds into step 3, where a breadth of local contacts sufficient to develop critical mass and direction is important.

Regionalism in the United Kingdom
During the last few years, there has been renewed interest in regionalism in the UK. There has been a steady devolution of legislative and administrative powers, particularly to Scotland and Wales, but also now extending to regions within England.

Prior to the Scottish Referendum, extensive regionalism in England was seen as unlikely (Pearce and Ayres, 2012), but the debates over Scotland appeared also to reenergise regionalism in England - to the point that Manchester will soon be one of the first UK cities outside of London to have an elected mayor, with some associated devolved powers.

Such developments are shaped by politics, and with a General Election due by May 2015, future extensions of this are far from certain. However, considering this apolitically, there does appear to be a groundswell of energy for more focus on local economies and how to develop them individually.

Underpinning smart specialisation is the drive for social innovation (Foray et al, 2012), where new interactions between different capacities, competences and pools of knowledge develop to meet specific local challenges.

There is little detail as yet of what ‘good’ social innovation may be to support smart specialisation, or whether such cooperative groups, when identified, can be either scaled up or replicated. However, if such systems can be identified and nurtured locally, they can be the underpinning framework for matching regional identity with technological specialisms.

The UK does have a history of using European Regional Development Fund (ERDF) financing to make regionalism work. In the West Midlands, for example, the Accelerate programme, which ran from 2002 to 2008, aimed at a shift away from traditional automotive supplies by developing innovative new products and services, with a goal of improving regional job retention (Ortega-Argilés, 2012).

Although preceding the concept of smart specialisation, the project embraced many of its principles: it connected industry and universities for focussed research; it consolidated local supply chains to add value; and it involved many local SMEs. Additionally, it showed that a long-term commitment to regional development can yield strong results.

Relevance to ICT
In view of the role of ICT as a driver of economic growth, innovation and productivity, contributing on average five per cent to GDP across the EU, the regulation laying down the framework requires that RIS3 strategies must, at a minimum, include a specific chapter addressing digital growth for the prioritisation of ICT investment. Indeed, in most circumstances ICT is either directly the product or service type that is a shared specialism, or it is a key enabling function in technological leadership, research and general innovation initiatives (Foray et al, 2012).

Entrepreneurs and the small organisation
Smart specialisation is a regulated and funded initiative that allows an entrepreneur or small, agile organisation to fully exploit the principle of co-opetition, where cooperation and competition can simultaneously be applied and exploited.

• Cooperation. Smart specialisation is founded on the principle of cooperation. It requires self-organising groups to establish a shared specialism where, combined, they provide depth and quality for a particular product or service type. The idea is that the specialism should become synonymous with the region in question, so by reputation people know where to go for that product or service.

• Competition. Within that, smart specialisation still leaves space for competition. In the same way that there is Champagne and then there are famous-named providers of champagne, the opportunity for the individual to compete for reputation and market share still exists. Sitting within a collective that draws the right attention, entrepreneurial activity to further specialise should provide both a focus and an opportunity for surer routes to markets, especially vertical or niche markets.

Specialism and mass customisation
In a world that is rapidly evolving towards universal digitisation, characterised by growing market demands for mass customisation and consumer empowerment (Bosch, 2009), disruptive digital technologies such as mobile apps, big data, cloud, social media and the internet of things have a crucial role to play, both as a specialism in themselves and as powerful enablers of particular regional specialisms.

The latter could be anything from the automotive industry to oil and gas, or financial services to ambient assisted living. For some specialisms, regional hotspots will immediately spring to mind, such as the West Midlands/Birmingham for automotive, or Leeds for financial services.

In either scenario, to achieve the desired level of mass customisation, often requiring skills in developing things such as service mash-ups, self-configuring devices and predictive analytics, more targeted investments in education, research and innovation will need to be committed. Smart specialisation is a framework that could help support the synergies between entrepreneurial discovery, R&I and regional specialism.
References
Bosch, J. (2009). From Software Product Lines to Software Ecosystems. In SPLC ’09: Proceedings of the 13th International Software Product Line Conference (pp. 111-119). Carnegie Mellon University.
European Commission (2013). Guide on Research and Innovation Strategies for Smart Specialisation (RIS3 Guide). Retrieved from http://s3platform.jrc.ec.europa.eu/s3pguide
European Union (2013). Regulation (EU) No 1303/2013. Official Journal of the European Union, 56 (20 December 2013), 1-1041. Retrieved from http://eur-lex.europa.eu/legal-content/EN/TXT/?uri=uriserv:OJ.L_.2013.347.01.0320.01.ENG
Foray, D., Goddard, J., Beldarrain, X. G., Landabaso, M., McCann, P., Morgan, K., Nauwelaers, C. and Ortega-Argilés, R. (2012). Guide to Research and Innovation Strategies for Smart Specialisations (RIS 3) (pp. 1-116). Retrieved from http://ec.europa.eu/regional_policy/sources/docgener/presenta/smart_specialisation/smart_ris3_2012.pdf
McCann, P. and Ortega-Argilés, R. (2013). Transforming European regional policy: a results-driven agenda and smart specialization. Oxford Review of Economic Policy, 29(2), 405-431.
Ortega-Argilés, R. (2012). Economic Transformation Strategies: Smart Specialisation Case Studies. Retrieved from http://freshproject.eu/data/user/01_public-area/PP3_Policy_impact/Economic_transformation_strategies.pdf
Pearce, G. and Ayres, S. (2012). Back to the Local? Recalibrating the regional tier of governance in England. Regional & Federal Studies, 22(1), 1-24.
Image: iStockPhoto/163652439
GREEN IT
IT RECOVERY
Danny Carri, Commercial
Development Manager of LA
Micro, explains why
organisations should take the
option of IT asset recovery more
seriously.
It is not surprising that the concept of IT
asset recovery is being readily embraced
by all kinds of organisations.
It ticks all of the boxes for sustainability
by ensuring that waste electrical and
electronic equipment is disposed of
correctly, and ensures that everything
that is suitable for re-use will be saved
and put to good use. It effectively prevents
computers from going into landfill and
extends the life of the machine – reducing
the carbon footprint.
It is very challenging for IT to be truly
‘green’. IT consumes massive amounts
of energy, and for a typical data centre to
become carbon-neutral, it would need to
plant hundreds of thousands of trees to
compensate for the carbon released into
the atmosphere.
Most of the efforts invested in making
IT greener so far focus on energy savings
within data centres and making devices
more energy-efficient. However the
manufacture of computers and electronic
equipment actually has a greater carbon
footprint, which is seldom discussed.
The manufacture of electronic
equipment, especially semiconductors,
makes intensive use of fossil fuels, energy,
water and raw materials. It is difficult to
find figures that are recent and credible,
but Apple publishes full environmental
reports for its products that illustrate this
point.
Apple estimates that a 21.5-inch iMac
accounts for a total of 610kg of CO2 over its life
cycle, of which 55 per cent is generated
during manufacturing. Of the remainder,
41 per cent of the emissions are generated
while the machine is in use – this figure
is based upon a four-year period of use
- and the rest is caused by transport and
recycling.
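As a quick illustration of how those percentages break down (using only the figures quoted above):

```python
# Splitting Apple's quoted 610kg lifecycle CO2 figure for a 21.5-inch iMac
# by the phase shares given in the text. Illustrative arithmetic only.

total_kg = 610
phases = {
    "manufacturing": 0.55,          # 55 per cent
    "use (4 years)": 0.41,          # 41 per cent
    "transport & recycling": 0.04,  # the remainder
}

for phase, share in phases.items():
    print(f"{phase}: {total_kg * share:.0f} kg CO2")
# manufacturing ≈ 336kg, use ≈ 250kg, transport & recycling ≈ 24kg
```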
Most electronic equipment is ultimately
sent for recycling, as directed by the
WEEE regulations, where it is dismantled
so that materials and metals can be
recovered in a safe way, including hazardous
substances such as cadmium, chromium, mercury
and lead. For IT and telecoms equipment,
the targets are currently 75 per cent of
materials to be recovered, and a minimum
of 65 per cent to be recycled. It is likely
that these targets will become stricter in
future.
However, if a computer can be sold to
a new owner who will use it for a year or
two, this extends its useful life, and is a
far more sustainable option than simply
recycling the materials.
Some organisations like to donate their unwanted equipment to voluntary groups or charities, but what is particularly interesting is how the secondary market for IT equipment has grown over the last few years, and how many corporate businesses are quietly choosing to use part of their IT equipment budget to purchase the less expensive, greener option of re-marketed equipment. They may designate this equipment for areas such as disaster recovery or software testing, or they may want to match familiar legacy hardware that they already have installed.

While brand new systems can come with bugs and support issues, re-used and refurbished machines are typically models that have been tried and tested, and they are usually sold with a manufacturer’s warranty or a warranty from the reseller. Warranties vary from one manufacturer to another, but it is normal for refurbished machines to be supplied with warranty periods of 30 days, 90 days or one year, and they can be offered for as long as three years.

Organisations that have surplus inventory and wish to offer it for resale can contact a broker or IT asset recovery specialist, who will purchase the decommissioned hardware and remove it. The machines will then be inspected, assessed for condition and usability, and designated for refurbishment and re-marketing, re-use or recycling, as appropriate.

A surprising amount of decommissioned IT equipment is in perfect working order. Other equipment is carefully taken apart and split into its component parts - CPU, motherboard, memory, network and graphics cards, power supplies, fans and so on - many of which can be re-used for repairs and maintenance. Parts that cannot be re-used directly are broken down and sent for recycling.
The chief concern is to ensure that, if
there should be any data remaining on
the disks, it will be removed securely and
never fall into the wrong hands.
In reality, most equipment changes
hands without the hard drives inside.
However, the drives should be data-wiped
to ensure that no data remains and to
comply with the Data Protection Act. There
are several ways of removing data from
disks and drives, so an organisation can
choose the method that fits best with the
level of risk perceived.
The disks can be treated with software-based
processes such as Blancco, which
remove the data electronically, or they
can be degaussed, where magnetic fields
are used to remove the data.
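As a minimal sketch of the software-based approach, the snippet below overwrites a single file with random bytes before deleting it. Commercial products such as Blancco operate at whole-device level, with verification and certified audit trails; this fragment illustrates only the basic overwrite idea and is not a substitute for them:

```python
# Minimal illustration of software-based erasure: overwrite a file's
# contents with random bytes (several passes), then delete it.
import os

def overwrite_and_delete(path, passes=3):
    size = os.path.getsize(path)
    with open(path, "r+b") as f:
        for _ in range(passes):
            f.seek(0)
            f.write(os.urandom(size))  # replace contents with random data
            f.flush()
            os.fsync(f.fileno())       # force the bytes to disk
    os.remove(path)

# Usage: overwrite_and_delete("old_customer_data.csv")
```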
If an organisation requires total certainty
that the data will never be accessible
again, there are mobile services that will
visit their premises and shred the disks
into tiny pieces, destroying them and their
contents forever. There should be an
audit trail to certify that the data has been
removed.
The Information Commissioner’s Office
publishes useful guidance on deleting
personal data from computers and
archiving it.
They also advise that all organisations
should have a formal IT asset disposal
strategy, and that the responsibility for
asset disposal should be assigned to a
member of staff with a suitable level of
authority, who will know which machines
contain sensitive data and be responsible
for deleting the information.
While it is clear that choosing re-used
equipment is the most sustainable
choice, the financial argument for IT asset
recovery is a compelling one, and we are
sure that the trend for reusing machines
will continue to grow.
Most organisations are very keen to
receive payment for excess inventory
or devices that have become surplus to
requirements, and are happy to dispose of
equipment this way, knowing that it is also
the best option for the environment.
LIFE SUSTAINING
Image: iStockPhoto/154321250
Bob Crooks MBCS, MBE, Chair of the BCS Green IT Specialist Group, reflects on the impacts of our
increasing dependency on ICT and digital services in our daily lives.
Our dependency on ICT is driven by
pressures on us to do things more efficiently,
to be ‘online’, to shop, to meet our regulatory
and legal obligations cost-efficiently, and/or
to have the latest ‘gizmo’ to keep up with our
peers.
This dependency is also increasing
pressure on the underlying storage
capacity, networks, power supplies and
data centres required to support those
services, and many of us in the UK, let
alone the rest of the world, have still to
connect.
With basic functionality being delivered
by all access devices, the focus of industry
is now on adding value and keeping or
extending market share through more
cosmetic enhancements, such as colour,
shape or different ways of wearing devices.
Perhaps, given the huge gains made in
helping those with physical limb disabilities
to manipulate and control artificial limbs
and implants, sometimes with processors
embedded in the brain, we could
ultimately be moving from Google Glass
to Matrix-style implants?
We should not underestimate the social
pressures to be online stirred up by the
media, celebrities and our own peers on
the playground or at work. Witness the
huge crowds seeking to acquire the next
version of the iPad or iPhone, versions
which may only offer an additional minor
function or two or a new shape/style.
This commoditisation has led to
devices being thrown away for seemingly
frivolous reasons, such as not being right
for the fashion or style of the times or not
having the latest gizmo function button. If
we are to stop generating ever-increasing
amounts of e-waste, we need take-back
schemes, such as that regulated for
batteries, and to close the loop by returning
products to the original manufacturer, who is
then under pressure to reuse and recycle.
Can that dependency continue to grow?
Certainly the UK Government thinks so,
and is looking at policies and strategies
that promote the digital information
economy and digital-by-default
government services. And manufacturers
and service providers continue to lure
us into more digital investments with
the latest devices and services that
are increasingly taking advantage of
the internet of things and sensors to
enable, for example, control of household
appliances from wherever you are.
We need fast, reliable, secure and
increasingly comprehensive and integrated
digital services, but these come at a cost,
often hidden and not clearly understood at
the point of purchase or use.
Everything that these services require,
from cables to switches to servers,
from mobile to smartphones, from PCs
to tablets, has to grow in capacity and
functionality to deliver the insatiable
appetite we have for digital services.
So how are we doing?
Our consumption of energy to power the
assets that provide our digital services is
growing in spite of the greater efficiency of
the assets themselves. In 2013, research
published on The Register1 indicated that
ICT globally was by then taking some 10 per
cent of the world’s electricity supply.
According to DCDi Census figures for
2013, the UK data centre industry used
about 3.1GW of power and this was
expected to rise to 3.68GW in 2016. Data
centres accounted for 1.2 per cent of UK
power consumed in 20132.
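Those census figures imply a steady growth rate, which can be checked in a couple of lines (assuming compound annual growth between 2013 and 2016):

```python
# Growth rate implied by the DCDi census figures quoted above
# (3.1GW in 2013 rising to an expected 3.68GW in 2016).
p2013, p2016 = 3.1, 3.68            # UK data centre power draw, GW
cagr = (p2016 / p2013) ** (1 / 3) - 1  # three years of compound growth
print(f"Implied annual growth: {cagr:.1%}")  # about 5.9% a year
```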
UK Government is waking up to the unique challenges and opportunities provided by our information economy and the data centre contribution to that.

Business Green reported3 that data centres, whilst already contributing around five per cent of the value of UK goods and services, have a turnover that is growing at around 15 per cent a year. Part of the growth is driven by the rise of cloud services, which subsequently increases the sector’s already hefty energy demand.

With the UK being one of the top global locations for data centres, this has led to the government in 2014 removing large data centre operators from the Carbon Reduction Commitment (CRC) energy efficiency scheme. Instead, they are now covered by the Climate Change Agreement regime, a regime originally brought in for energy-demanding industries.

In return for meeting the targets set in those agreements, operators are able to make savings on their climate change levies. For example, the agreement reached between the industry, as represented by techUK, and the Department of Energy and Climate Change (DECC) requires improved cooling4.

It is believed that the National Grid, in coping with a reduction in spare capacity from 17 per cent in 2011 to four per cent this year, is in discussion with the larger data centre operators for help with keeping the power flowing at times of peak demand - for example, by having local back-up generators switched on and grid power switched off.

Alongside the increasing power demands of the industry, our ICT assets consume other resources just as voraciously, including oil for plastics, rare metals for the electronics and displays, and water in production processes. Whilst there have been enormous increases in the efficiency of extraction, it is the sheer volumes that remain of concern, with known reserves of metals expected to fall below levels of demand in the coming years.
What can we do?
However, there are other stories to balance
this rather gloomy picture. These are not
based on the ‘take’ of the ICT industry but
on the ‘give back’ that ICT can and does
bring to our lives, organisations and services.
We need to invest to meet the resources
demanded by a growing ICT sector, rather
than seek to limit that demand or leave
demand management to market forces,
as ICT now provides critical services that
are and will increasingly sustain our lives
and reduce our dependency on harvesting
fresh resources from the planet’s reserves.
These hardware components have an intrinsic value, reflecting the consumption of resources that goes into their production and operation, and finally into their safe recycling and disposal. So, recognising that we need them, we should make sure we exploit them fully, so that the take of resources and natural capital over their life cycle is more than matched by the savings we are able to derive from their use in all aspects of our lives.
So what does this mean for us as digital
leaders in our organisations? It means that
we need to:
• know where our energy and resource consumption occurs;
• look at life cycle impacts from cradle to grave and resurrection;
• know the risks we take when adopting cloud services.
In BCS, the Green Specialist Group is
seeking to widen the debate and
awareness of these issues through the
coming year, by focusing on three key
themes:
• the internet of things;
• energy and ICT;
• education, training and awareness.
Watch the web page5 for more details on
events, lessons learnt and best practices.
Join us and give us your views and
experience in tackling these issues.
References
1. www.theregister.co.uk/2013/08/16/it_electricity_use_worse_than_you_thought/
2. www.gov.uk/government/uploads/system/uploads/attachment_data/file/337649/chapter_5.pdf
3. www.businessgreen.com/bg/interview/2346601/uk-data-centres-poised-to-power-up-emissions-savings
4. www.techuk.org/insights/reports/item/897-what-is-a-cca
5. www.bcs.org/category/10547
Image: iStockPhoto/179402809
DATA TSUNAMI
Neil Morgan MBCS, EEC Programme Manager, Hartree Centre, discusses energy efficient computing and
the challenges of maintaining a sustainable future in the face of the growing data tsunami.
Evolving IT industry services, including
cloud provision and the maturing of real-time,
data-intensive computing, are driving
data growth at an unprecedented rate.
Factor in the proliferation of sensor
nodes and the internet of things (IoT),
embedded systems and the race towards
exascale high performance computing and
it’s clear that the energy implications of
these changes are a major challenge.
Finding ways to reduce the energy
consumed by our computing devices,
whether they are mobile phones, data
centre servers or supercomputers, whilst
sustaining the performance levels, is the
most significant challenge the computer
industry faces today.
IT has always been a rapidly evolving
sector, but significant changes in the IT
market such as the increased uptake of
storage and computing in the cloud and
the maturing of other technologies such
as data-intensive analytics have led to an
unprecedented increase in data volumes.
The management of data in all stages
of its life cycle - from creation through
to destruction, and the need to analyse,
integrate and share data often in real
time - has driven a surge in demand
from consumers and businesses alike.
The impact in terms of global energy
consumption has been dramatic.
Extrapolating from Jonathan Koomey’s
2011 report to the US Congress, we can
estimate that data centre energy usage
for 2015 could reach 600TWh, at a cost of
£45bn per annum.
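A back-of-envelope check of those extrapolated figures (illustrative only; it simply derives the electricity price the two numbers jointly imply):

```python
# Sanity-check of the extrapolation above: 600TWh at a total cost of £45bn
# implies an assumed average electricity price.
energy_twh = 600
cost_gbp = 45e9

kwh = energy_twh * 1e9            # 1 TWh = 1e9 kWh
price_per_kwh = cost_gbp / kwh
print(f"Implied price: £{price_per_kwh:.3f}/kWh")  # £0.075/kWh
```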
Looking back over the last 30 years
it’s easy to see how we have arrived
at this point. The development of the
microprocessor in the early 1970s has
had a truly profound impact. Indeed it’s
arguable that no other technological
advance can equal the scale and scope of
its impact in such a short period of time.
The humble silicon chip has driven
the digital revolution and in doing so has
influenced nearly every field of human
endeavour. From the phone in your
pocket to the world’s most powerful
supercomputers – the benefits have been
immense. Who now could imagine living
without their smartphone, tablet or access
to the internet?
While these consumer devices and
applications have changed the way
nearly all of us live our lives, the growth
in computing power, both in terms
of hardware and software, has seen
dramatic changes in scientific research
and industry as well.
High performance computing and the
ability to undertake large-scale complex
simulations within a short period of time
and provide accurate consistent results
have formed the basis for modern scientific
research and industrial product development,
helping to solve problems previously thought
too large and complex to attempt, such as
sequencing the human genome or identifying
the existence of the Higgs boson.
The microprocessor and specifically
the multicore microprocessor are now
ubiquitous, embedded in a huge array of
devices. With the development of the IoT
this trend will only increase. As of 2014
there were 1.75 billion smartphones in
use globally and it is anticipated that in
2015 we will reach two billion personal
computers – consuming approximately
280 gigawatt hours of electricity per year.
It’s worth pausing to consider that these
numbers don’t take into consideration
embedded digital devices, wearable
technologies or the IoT, which are rapidly
adding to energy consumption – in UK data centres
alone it is currently estimated to cost £5.6
billion annually.
This energy cost has global implications
as we attempt to reduce our reliance on
fossil fuels and limit the effects of climate
change - a 2008 report by McKinsey
suggested that data centres produced
about 0.2 per cent of all CO2 emissions
in 2007, but with the rapid growth of this
technology could rise to as much as 340
megatons of CO2 by 2020.
Traditionally the key focus of systems
designers has been on performance and
faster processing in a smaller form factor
without attention to the power consumption,
but this is no longer sufficient.
Indeed for the past 10 years or so it
has not been possible to design new
processors in the traditional way (by
increasing the clock frequency of a single
core processor) due to the constraints of
physics (the processors would be very very
hot and consume lots of power!), leading
to the multi and now many-core processor
designs. Developing software for such
hardware requires specialist effort and has
led to many new programming paradigms
and algorithms, which aim for optimal
performance and energy consumption. If
growth is to continue as anticipated and we
The growth in hardware is also mirrored by data
growth, which current estimates indicate will reach
40 billion terabytes by 2020!
offer significantly increased performance
for certain applications at a reduced
energy cost. Novel cooling approaches,
data centre infrastructure management,
intelligent power state management and
advanced job scheduling will support
the development of the low-power data
centres of the future.
This is only part of the story however. If
we are to achieve sustainable efficiencies
to meet our growing data, compute and
energy demands, we must take a truly
integrated approach to all those factors
that contribute to energy consumption.
As well as incremental improvement in
existing technologies the development
of new architectures and programming
paradigms are essential if we are to
find a sustainable solution to meeting
our growing computational energy
requirements.
It is essential that we change the
attitudes of designers, developers and
users so that the metric is not FLOP/
second, but rather FLOP/Watt. This is a
significant shift in an industry that has
been dominated by Moore’s law and a
preoccupation with hardware performance
at the expense of software.
The successful realisation of highperformance, low-energy computing can
only be achieved through a greater focus
on improved software design - compilers,
tools, algorithms and applications - that
can fully utilise the potential parallel nature
of today’s multicore processors.
emerging sectors. The growth in hardware
is also mirrored by data growth, which
current estimates indicate will reach 40
billion terabytes by 2020!
The all-pervasiveness of digital
technology has had a significant effect
on our global energy consumption. The
manufacture of digital devices in all
their forms, whether embedded lowpower systems or high performance
supercomputers, along with the creation,
transmission, storage and use of data,
have all had a significant effect on global
Notes
The Energy Efficient Computing Research
Programme at the Hartree Centre aims
to investigate innovative new approaches
to the development of both hardware
and software to address this challenge.
Building upon the expertise already present
within the Scientific Computing Department
and through collaboration with research
and industry partners it will act as a centre
of excellence, enabling scientific advances
and delivering real economic benefit for
the UK.
are to reap the benefits, we must reduce
the energy requirements of our devices in
all sectors through a co-design approach.
Significant progress has been made in
some areas already. The development of
low-power processors such as the ARM
64v8 provides an energy-efficient, smallform factor alternative to traditional server
technologies.
Other developments such as
heterogeneous computing that integrate
traditional processors with co-processors
(e.g. NVIDIA GPU’s or Intel’s Xeon Phi)
2015 DIGITAL LEADERS
53
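The FLOP/Watt metric that the article above advocates is simply sustained performance divided by power draw, and ranking on it instead of raw speed can change which system "wins". A minimal sketch, with all figures invented for illustration rather than measured on real machines:

```python
# Rank systems by energy efficiency (GFLOPS/Watt) rather than raw speed.
# All numbers are illustrative, not measurements of real hardware.
systems = {
    "big-iron":  {"gflops": 1000.0, "watts": 500.0},  # fastest overall
    "many-core": {"gflops": 600.0,  "watts": 150.0},
    "low-power": {"gflops": 120.0,  "watts": 20.0},   # slowest overall
}

def gflops_per_watt(spec):
    return spec["gflops"] / spec["watts"]

by_speed = max(systems, key=lambda s: systems[s]["gflops"])
by_efficiency = max(systems, key=lambda s: gflops_per_watt(systems[s]))
print(by_speed)       # big-iron
print(by_efficiency)  # low-power: 6 GFLOPS/W beats 4 and 2
```

The fastest machine here is three times less efficient than the slowest, which is exactly the trade-off the FLOP/Watt metric is designed to expose.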
INTERNET OF THINGS
INTELLIGENT BUILDINGS
Image: iStock/455043567
Ed Macey-MacLeod MBCS, Senior Associate Director at IPT Design, encourages us to consider buildings as being like large IT systems and provides some helpful suggestions as to how to turn your office into an intelligent building.
I am going to presume that you are reading this sitting in, or on your way to, your office. Your office is likely to be a single room forming part of a larger floor plate, or a desk on the floor plate.
You are likely to have used a security card to gain access to the building, maybe even within the lift that took you to your floor. The temperature of the space will be comfortable and the lighting adequate.
In construction we refer to the systems within a building that operate, condition and secure the building as building services. Increasingly these building services are IT systems.
What does this mean? Does it have any effect on you as an employer? Certainly it could. More often than not building services operate independently, and they could benefit from integration with each other and from being managed as IT systems.
By way of illustration, the BMS (building management system) is the building service that controls the heating and cooling within an office. It will often be programmed to start within certain scheduled times based on a normal working week.
If this system were set up to review the electronic corporate calendar, it could take the decision not to start heating the building on a bank holiday, when there is a corporate away day, or when the office is shut for the Christmas break.
Other examples become more intricate and sound far-fetched, but are possible: from virtual barriers detecting when a visitor has strayed into an area they are not meant to be in, to a lift diverted to ferry an important client directly to the correct floor for a presentation.
We should not dismiss the idea of an intelligent building and its potential usefulness because some of the ways the building could function sound outlandish. An intelligent building relies on a strong foundation; the interactivity among systems can be programmed after that.
Almost all of the building services in your building will rely on a data network and certainly a server to control themselves.
Sometimes the server is called the head-end or supervisory PC. I have seen it relegated to the workshop, where it becomes clogged with dust and debris. Rarely are these systems ever treated like IT systems: cables are wayward, no software patching is applied to the PCs, and there is no proper access control to prevent unauthorised access or to track who has made a change.
So I propose two things to enable us to get better use out of our buildings, that is, to create intelligent buildings:
1. Treat all building services as IT systems.
2. Create interactions between building services, assisted by open protocols.
When we do look behind the curtain at the BMS or the security system, we as IT practitioners will see devices we are familiar with (or were familiar with): there are edge devices, network cabling, hubs, switches and servers.
Whilst these systems might resemble IT systems, they are often not treated as part of the organisation's IT equipment; proper service management is not applied and they are left forgotten and untouched.
This neglect is unwise because we can exploit these systems to increase productivity and reduce downtime. IT managers are adept at following procedures and ensuring that systems do not routinely fail; pre-emptive care is applied rather than reactive care.
This cannot always be said of building services, which may not be constantly monitored for faults or developing faults. This is likely because the systems are maintained by a contractor who has to visit the site to review the system and only does so infrequently.
If these systems are brought under the purview of the IT department then:
• their networks can be administered correctly, both physically and logically;
• IT service management processes and procedures can be applied;
• access to the service can be secured and non-repudiation will be possible;
• secure remote access can be given to interested parties;
• data produced by the building services can be collected and put to use;
• interaction among different systems can be enabled.
Examples of the data that can be produced and analysed include:
• The frequency with which people access certain rooms or attempt to access rooms they should not. Analysis of this activity might show persistent attempts to breach security, or could be used as a time and attendance record.
• The carbon dioxide detectors in meeting rooms could be reviewed against the room booking system to highlight where rooms are being underused.
• People's movements, based on their mobile phone's association with Wi-Fi access points or mobile phone tracking antennae, can be collected for analysis. This information could show dwell time in the cafeteria or the hallway, or be used to help locate someone when the need arises.
This analysis can be automated and dynamic, offered to the user as concise alerts to help manage their building rather than having to trawl through reams of data.
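The calendar-aware BMS idea described earlier in this article, skipping the heating schedule whenever the corporate calendar says the office is shut, reduces to a single rule. A hypothetical sketch (the closure dates and schedule are invented for illustration; a real BMS would read them from the calendar system):

```python
from datetime import date

# Days the corporate calendar marks as office-shut (invented sample data).
closures = {date(2015, 4, 3), date(2015, 4, 6)}  # Easter bank holidays 2015

def should_heat(day: date, closures: set) -> bool:
    """Heat only on normal working days: Monday-Friday and not a closure."""
    return day.weekday() < 5 and day not in closures

print(should_heat(date(2015, 4, 2), closures))  # Thursday, open  -> True
print(should_heat(date(2015, 4, 3), closures))  # Good Friday     -> False
print(should_heat(date(2015, 4, 4), closures))  # Saturday        -> False
```

The whole benefit comes from the data feed, not the logic: once the calendar is machine-readable, the rule itself is trivial.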
Interactions among systems could be as
simple as associating users’ access control
cards to their desktop PC and phone to
power both down when they swipe out of
the building or conversely turn both on
when they enter the building.
Security systems can be integrated to
allow interaction between access control
and CCTV systems so that the area where
an alarm has been triggered is shown
automatically on the CCTV screens.
When presence detectors inform the
lighting control system, BMS and solenoid
valves that no one is present, all of the
systems ramp down. This could be
correlated with the access control system
to determine if someone may still be in the
office, but undetected.
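Interactions like these are naturally modelled as events routed to whichever systems care about them. A toy sketch of such a publish/subscribe bus follows; every device name and handler is invented for illustration:

```python
# Toy publish/subscribe bus: building systems react to each other's events.
from collections import defaultdict

handlers = defaultdict(list)
actions = []  # record of what each subsystem did

def subscribe(event, handler):
    handlers[event].append(handler)

def publish(event, **data):
    for handler in handlers[event]:
        handler(**data)

# Access control tells IT to power a user's desk down on swipe-out.
subscribe("swipe_out", lambda user: actions.append(f"power down desk of {user}"))
# Presence detection tells lighting and the BMS to ramp down an empty zone.
subscribe("zone_empty", lambda zone: actions.append(f"lights off in {zone}"))
subscribe("zone_empty", lambda zone: actions.append(f"HVAC setback in {zone}"))

publish("swipe_out", user="alice")
publish("zone_empty", zone="floor-3")
print(actions)
```

Adding a new interaction (say, CCTV following an alarm) is then just another subscription, with no change to the system that raises the event.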
Interactions can also take place
between the AV and security systems.
During a building lockdown scenario
(shelter-in-place), the video screens can
be automatically tuned to an information
screen requesting people stay in their
rooms.
These are just a few of many beneficial
interactions possible among integrated
systems, which together create an
intelligent building.
For systems to interact adequately with each other they need to exchange data that both can understand. We rely on building services using open protocols, and on systems that talk multiple open protocols; these can normalise the data received and push it out to the dependent systems.
In the past systems were designed
using proprietary protocols and a number
of these systems are still installed.
Interactions among these systems are
difficult as they are not designed for
interactions; however, integration can
be achieved using gateways and newer
equipment.
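A gateway of the kind mentioned here essentially translates each protocol, open or proprietary, into one common representation before passing readings on. A minimal sketch, with invented adapter functions and data formats standing in for real protocol drivers such as BACnet or Modbus:

```python
# Normalise readings from different building protocols into one common shape.
def from_legacy_bms(raw):
    """Invented proprietary format: 'T;215' means 21.5 degrees C."""
    kind, tenths = raw.split(";")
    return {"type": "temperature", "value": int(tenths) / 10, "unit": "C"}

def from_modbus_like(register):
    """Invented register map: value scaled by 100."""
    return {"type": "temperature", "value": register / 100, "unit": "C"}

def gateway(sources):
    """Run every source's adapter and emit uniform readings."""
    return [adapter(raw) for adapter, raw in sources]

readings = gateway([(from_legacy_bms, "T;215"), (from_modbus_like, 2215)])
print(readings[0]["value"], readings[1]["value"])  # 21.5 22.15
```

Dependent systems then only ever see the normalised shape, so adding a new protocol means writing one more adapter rather than touching every consumer.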
In conclusion, when reviewing the
existing commercial real estate operated
by your organisation or when assessing
new office accommodation, consider the
advantages of managing all the building
services as IT systems. Develop a building
IT policy to include within the wider
organisation IT policy.
This will describe how the organisation
approaches the new customers of facilities
management and security. It will also
record what interactivity is required
between the systems to help the business
function, thus creating an intelligent
building.
INTERNET OF THINGS
GLOBAL VISION
Image: iStock/536924427
Gordon Fletcher MBCS, International Operations and Information Management, discusses the
ever-growing promise of the internet of things (IoT).
‘The internet of things is a vision. It is being
built today’1; this is the simple but bold
claim of Council, a European think tank
established to manage and consult about
the internet of things.
Beneath this single statement lie years
of complex development and a multiplicity
of activities, products and business models,
which currently pull understanding of the
internet of things in so many different
directions.
Depending on how you choose to chart
its history, the internet of things is at least
23 years old and arguably started with a
webcam pointing at a coffee machine2.
Enabling a series of streaming images
from a camera that could be accessed
through a web browser was revolutionary
in 1991. At the time the coffee-cam was
generally presented in the popular media as
a gimmick. This first step in an internet of
things was almost dismissed because the
majority of commentators could not imagine
any wider application or a viable business
model for webcams.
Now, in 2015, webcams are
commonplace in the home and business
environment as well as regularly being
used on traditional television news
broadcasts. Most users would not even consider their webcam part of the internet of things.
The retrospective internet of things
Perhaps more confusing for the understanding of the internet of things are the two different directions of development that currently showcase the concept to a popular audience.
Currently, the most visible activities
around the internet of things come from
the very many projects, including hobbyist
and private projects that set out to create
a form of retrospective internet of things.
In effect, these projects are creating an
internet of things by adding a unique
machine-readable tag of some type onto
an existing physical item. Some of these
projects take the most low-tech approach
by adding a QR code onto an item that has
a meaning or a backstory3.
Being low-tech the internet is reduced in
these projects to a one-way communications
channel for the broadcast of relatively static
information. Other projects promoting the
retrospective internet of things offer more
dynamic data by incorporating a range of
environment sensors into the tag4.
The many variants on these tags
generally draw in some way on Apple’s
iBeacon technology5, which has primarily
focused on sensing the proximity of other
devices at a single – unique – location. The
primary application of iBeacon so far has
been to push out customised vouchering to
suitably activated mobile phones.
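The proximity-triggered vouchering described here boils down to two steps: estimate distance from received signal strength, and push an offer once a phone is close enough. A hedged simulation using the standard log-distance path-loss model (the calibration constants, threshold and voucher are all invented; a real deployment would use the platform's beacon-ranging APIs rather than raw maths):

```python
# Simulate proximity detection from beacon signal strength (RSSI, in dBm).
def estimate_distance(rssi, measured_power=-59, n=2.0):
    """Log-distance path-loss model: approximate distance in metres."""
    return 10 ** ((measured_power - rssi) / (10 * n))

def voucher_for(rssi, threshold_m=2.0):
    """Push an offer only when the phone is within threshold_m of the beacon."""
    if estimate_distance(rssi) <= threshold_m:
        return "10% off coffee today"
    return None

print(voucher_for(-59))  # about 1 m away -> voucher
print(voucher_for(-85))  # roughly 20 m away -> None
```

In practice RSSI is noisy, which is why real beacon APIs report coarse proximity bands rather than exact distances.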
What becomes clear in the development
of a retrospective internet of things is that
any unique thing, even if it is online, by itself
is of little value. A network of sensors and
the monitoring of changes in these sensors
will produce an interesting and compelling
use case for a retrospective internet of
things. However, a sufficiently large and
successful retrospective internet of things is
yet to materialise.
The prospective IoT
More speculatively and still primarily the
domain of R&D departments in larger
technology companies is the prospective
internet of things. The development of a
prospective internet of things is the riskier
and more fraught route for development.
Misterhouse6, a very early home automation system written as a private project, has been in development since 2000, with some interest and installations, but is far from receiving any sort of general acceptance. LG also attracted significant headlines in 2000 with its Dios refrigerator7.
Fourteen years later the world is still trying to understand the benefits of an internet-connected fridge, while silently, in living rooms around the world, television sets have increasingly become 'smart'.
This quiet change has happened with less emphasis on the device being online, but significant attention put onto the increased range of options, content and close integration with 'things' that we already take for granted as being online – phones, tablets and laptops.
By stealth and very patiently, a form of Misterhouse is developing with household devices that actually benefit from being digital by being able to talk to each other. Over time the Philips Hue lighting system and other, ever more mundane, devices will become part of this network of things as the benefits of their networked interconnectivity are recognised by everyday consumers.
The prospective internet of things embeds connectivity into the core of a device – for these things, being offline appears almost nonsensical, and when they are offline the user experience is significantly frustrating.
Cisco's Planetary Skin project8 is exactly the type of internet of things that could not be realised even in part without the support of a large technology company. The concept of creating a planetary network of interconnected sensors was part of the prospective internet of things in its ambition, but is too tied to the ethos of the retrospective internet of things in its development to be truly viable.
Perhaps the most challenging and
innovative project for the prospective
internet of things is IBM’s Adept project9.
The project makes use of the Bitcoin
blockchain and BitTorrent as well as a
secure communications protocol to enable
a more sophisticated fully communicative
internet of things. Embedding the use of a
cryptocurrency technology into the system
adds something genuinely new into digital
things – recognisable exchange value.
Integrating things with a currency and giving them a value would add new impetus to the development of a network of things where real exchange value could be built directly into items and, in effect, integrate things automatically into a familiar economy that can be managed by existing tools and platforms, including Amazon and eBay.
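The idea of giving things "recognisable exchange value" can be illustrated with a toy ledger in which devices hold balances and pay one another for services. This is a sketch of the concept only, with invented device names and amounts; the Adept project itself builds on blockchain infrastructure rather than a central dictionary like this:

```python
# Toy ledger: networked devices hold balances and pay each other for services.
ledger = {"washing_machine": 5.00, "energy_meter": 0.00}

def pay(ledger, payer, payee, amount):
    """Transfer value between devices, refusing overdrafts."""
    if ledger[payer] < amount:
        raise ValueError("insufficient funds")
    ledger[payer] -= amount
    ledger[payee] += amount

# The washing machine pays the meter for a cheap off-peak electricity slot.
pay(ledger, "washing_machine", "energy_meter", 1.25)
print(ledger)  # {'washing_machine': 3.75, 'energy_meter': 1.25}
```

The blockchain's contribution is to provide exactly this bookkeeping without any single trusted party holding the ledger.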
The internet of things is being built today.
It was being built 20 years ago and it will
still be under construction in 20 years’ time.
The challenge right now is to create a small
aspect of a prospective internet of things that
produces real value and benefit. The internet
of things is not a marketing device and will
only be of use when consumers can see
transparent value, not just because it is ‘on
the internet’, but because their lives genuinely
benefit.
References
1. www.theinternetofthings.eu
2. www.technologyreview.com/article/401059/coffee-cam
3. www.talesofthings.com
4. http://estimote.com
5. https://developer.apple.com/ibeacon
6. http://misterhouse.sourceforge.net
7. www.digitalartsonline.co.uk/news/creative-lifestyle/lg-launches-internet-fridge/
8. http://newsroom.cisco.com/dlls/2010/hd_092710.html
9. www.coindesk.com/ibm-sees-role-block-chain-internet-things
Image: iStock/534704003
INTERNET OF THINGS
HUB OF ALL THINGS
With the ever increasing concern over the lack of privacy of personal data in the forefront of so many
people’s minds, Peter Ward, a director at Sunrise Consulting, shares some good news regarding a new
programme, Hub-of-all-Things (HAT), that he is helping to develop, along with the University of Warwick,
which aims to address many of these privacy fears.
2014 saw a big increase in our awareness
that our personal data, now collected by
everyone from banks and supermarkets to
our fitness monitors and home automation
systems, ends up being owned by the
companies who collect it and not by us.
While some companies will sell our data
back to us, others do not even allow us to
see what they have recorded about us.
As data collection has increased, so has
our concern that our data is not being held
securely. Whether it’s a result of deliberate
attacks or accidental failures, we’re no
longer certain that our private data will
remain private. And as the growth in the
internet of things (IoT) permits us and others
to collect an increasing amount of private
data about our lives, it’s time to consider how
we can ‘reclaim our data’.
Reclaiming our data
A group of UK researchers is leading the
field to change the way that our personal
data is owned and shared.
The Hub-of-All-Things (HAT)
programme (http://hubofallthings.org)
starts with the position that data collected
by and about us should belong to us. It
should not belong either to those whose
tools we use to collect it or to those who
collect it as we interact with them.
From this starting point, the programme
is researching how we might exploit our
data for our own benefit when it’s all
brought together.
Collecting the data together from multiple
sources puts it into context. Those who
work with big data would love to have that,
but they have to work with data where
the context has been removed through
aggregation and anonymisation.
HAT data, on the other hand, can be
fully contextual because it can include
information on everything that is going on
in our lives. This makes it potentially very
valuable.
As a simple example, imagine giving
a company permission to read several
of your data streams in context: to know
what you eat using information from your
supermarket loyalty cards, how much you
exercise extracted from your fitness monitor,
and the trend in your weight sent from your
IoT-enabled scales.
It could analyse those streams together
in context and offer advice or products
that could help you personally in a way
that just would not be possible while those
pieces of information remain separate and
anonymised. It becomes cost-effective for
companies to treat you as a ‘market of one’,
providing you with personalised offerings
relevant specifically to you.
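The value of context comes from joining streams that are meaningless alone. A small sketch of such a join, with invented sample data standing in for the loyalty-card, fitness-monitor and scales feeds just described:

```python
# Join three personal data streams by date to build a contextual record.
diet     = {"2015-03-01": 2600, "2015-03-02": 2100}   # calories purchased
exercise = {"2015-03-01": 20,   "2015-03-02": 45}     # minutes of activity
weight   = {"2015-03-01": 82.0, "2015-03-02": 81.6}   # kg from IoT scales

def in_context(*streams):
    """Merge streams on the dates they all share."""
    shared = set.intersection(*(set(s) for s in streams))
    return {d: tuple(s[d] for s in streams) for d in sorted(shared)}

merged = in_context(diet, exercise, weight)
print(merged["2015-03-02"])  # (2100, 45, 81.6)
```

Each stream on its own says little; the joined record is what lets an adviser connect eating, exercise and weight for one person on one day.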
The HAT programme is, therefore, not just
interested in making it possible to collect
and store our data securely, but also in how
we might then exploit the value of our data
in context. It is creating a marketplace for
our personal data so that we can exchange
it for goods and services that are personally
relevant to us as individuals.
The personal data vision
It starts with an encrypted database. We
anticipate that this will be made available to
you by a ‘HAT provider’. It may be in your
home – perhaps on your laptop, your router
or your home automation system – or in
the cloud. In this HAT data store all your
personal data from multiple ‘HAT-ready’
sources is brought together securely.
Those sources will use application
programming interfaces (APIs) to post
their data into your database. You will be
free to choose who can read your data.
You will be able to view your data using
an app on your browser, phone or tablet. You
will see how the data from multiple sources
comes together in context to give you new
insights into your own life. And you can trade
those insights with others to get the products
and services you want.
The trading process can take one of two
forms. Firstly, you will be able to install apps
that use data in your HAT database to engage
with companies and other organisations.
For example, there may be an app that uses
your online supermarket’s API to reorder
groceries or cosmetics based on your rate of
consumption. Or an app to share information
with the local homeless shelter on the food
in your fridge that is close to its sell-by
date so they can collect it for distribution
instead of it going to waste. Or an app to
share information on your weight and level
of fitness with your doctor so they can
recommend a new exercise regime.
Secondly, you can grant access to others
to read specified subsets of your data in
order for them to make personalised offers
to you. An example might be to allow a high
street retailer to access information on
the clothes in your wardrobe so that it can
propose the perfect new jacket that enhances
what you already have.
You may charge the retailer a fee to see
your wardrobe or it may offer you a discount;
either way, you benefit financially from the
transaction as well as receiving a personalised
offer that would not be available to you
otherwise. The retailer also benefits because
its offer is shaped by your context and you are,
therefore, more likely to accept it.
Whichever method you use to trade, your
data remains safely stored in your own
private database with its tight security under
your control.
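The access model just described, where data stays in your store and readers see only the subsets you grant, can be sketched in a few lines. This is a conceptual illustration with invented names, not the HAT programme's actual API:

```python
# A personal data store where the owner grants per-reader, per-stream access.
class PersonalDataStore:
    def __init__(self):
        self.streams = {}  # stream name -> data
        self.grants = {}   # reader -> set of streams they may read

    def put(self, stream, data):
        self.streams[stream] = data

    def grant(self, reader, stream):
        self.grants.setdefault(reader, set()).add(stream)

    def read(self, reader, stream):
        if stream not in self.grants.get(reader, set()):
            raise PermissionError(f"{reader} may not read {stream}")
        return self.streams[stream]

hat = PersonalDataStore()
hat.put("wardrobe", ["navy jacket", "grey suit"])
hat.put("weight", [82.0, 81.6])
hat.grant("retailer", "wardrobe")        # the retailer may see your wardrobe
print(hat.read("retailer", "wardrobe"))  # ...but reading "weight" would raise
```

The crucial inversion is that the grant lives with the owner, not the reader: revoking access is a local change to your own store.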
Ready for launch..?
So is this HAT research programme ready
for prime time? Not quite! The concept is
being tested in the lives of a small
number of researchers. IoT-enabled
devices – including some specially created
for the programme – are feeding data into
the researchers’ cloud-hosted databases,
complemented by additional data extracted
from their social media feeds.
The HAT user interface allows the
researchers to analyse their own data,
and apps are being written to support its
exchange for products and services.
But are we ready to stand up to Google
to reclaim ownership of our search data,
and to Tesco for our shopping history? No.
There’s still a way to go. But this research
programme has made a start. It’s showing
that it’s possible for us to own our personal
data and to decide for ourselves who gets to
see it.
In this way, a market can be created that
will allow each of us to receive a return
on the information we share, and it’s a
unique market because that information is
contextual and personalised. It will allow
companies to relate to us personally,
meaning that we can receive offers that we
are much more likely to value.
We’re planning to move into limited pilot
in early 2015 and if all goes well then we
could see a rollout in mid-2015. To facilitate
this we’re considering establishing a ‘HAT
Foundation’ that owns the HAT specifications
and approves HAT database providers. We’re
targeting the first million HAT databases
and have aggressive timescales, but if we’re
going to reclaim our data then there’s not a
moment to lose. If you want to join us then
please visit http://hubofallthings.org/join-the-hat-revolution to learn more.
Notes
The HAT programme is funded by Research Councils UK's Digital Economy programme. It involves interdisciplinary
researchers in the areas of economics,
business, computing and the arts from
the six universities of Warwick, Exeter,
Cambridge, Nottingham, Edinburgh and
the West of England. Its objective is to
create a multi-sided market platform for
personal data generated from connected
services and products, in particular from
IoT devices.
LEARNING AND DEVELOPMENT
LEADERSHIP DNA
Image: iStock/153794211
Leadership means different things to different people and there are lots of different leadership styles.
With this in mind, Beverley Corcoran MBCS, British Airways IT Learning & Development Manager, considers how we define, develop and grow our leadership skills.
Some common myths about leadership are
that it is all about seniority, hierarchy or
just managing well – not true! Leadership can be people-, project- or thought-based, technical in nature, or as simple as leading by example.
Leadership is therefore possible at all
levels of the organisation and at any stage
of your career.
A good starting point when defining and
developing the DNA of your leadership
skills is to create a vision for your career,
along with a draft personal development
strategy on how to achieve it. ‘Easier said
than done right?’ Well, if it was that easy,
we would all have one.
At the outset it might be helpful
to consider the following aspects of
leadership to help shape your thinking:
• Setting direction through a compelling
and inspiring vision.
• Motivating and inspiring others to
engage with the vision.
• Coaching and building an effective team to achieve the vision.
So what does all this mean for you as
an individual?
The following top five hints and tips
are some ideas aimed at helping you to
formulate, or enhance your leadership
DNA:
1. Set clear direction
‘Imagination is more important than
knowledge. For while knowledge defines
all we currently know and understand,
imagination points to all we might yet
discover and create.’ – Albert Einstein
What do you aspire towards? Define your
personal vision; it need not be set in stone.
Think about your values, beliefs and
aspirations and ensure your vision lines
up. If you don’t know where you are going,
how will you ever know if you get there?
How can others follow you if the path is
unclear?
So how does this translate to the
workplace?
• Set the scene. Be clear on
accountabilities and what you expect
from others.
• Think big, be transformational. Inspire
others to follow you. Peter Drucker
once said: ‘The only definition of a
leader is someone who has followers.’
Most importantly of all, listen to
others, harness their skills and expertise
to help shape your thinking.
2. Motivate others
'Leadership is the art of getting someone else to do something you want done because he wants to do it.' – Dwight D. Eisenhower
Define your personal brand. How do you want to be perceived and remembered? What leaders inspire you? How will your actions generate the necessary momentum to drive things forward?
So how does this translate to the workplace?
• Be an energy giver – drive things forward with a 'can-do' attitude, inspiring others around you. Charisma helps drive momentum.
• Demonstrate resilience and perseverance when the going gets tough. As Nelson Mandela once said: 'It is better to lead from behind and to put others in front, especially when you celebrate victory when nice things occur. You take the front line when there is danger. Then people will appreciate your leadership.'
• Be authentic – show the real you and live your values and beliefs. Role model the behaviours you believe in and expect from others. Treat others with respect and as you would wish to be treated. Ensure this is mirrored in your body language.
• Communicate openly and honestly and truly listen to those around you. Don't dismiss their opinions, interrupt them or pay lip service to their ideas – focus on building trust.
• What motivates you? Teams are real people with real drivers. Understand what is important to those around you – Edgar Schein's 'Career Anchors'1 might provide a helpful insight in this regard.
• Never underestimate the power of recognition, which is not just monetary. Often a simple 'thank you' delivered with genuine intent goes a long way.
• How are you perceived? Think retention. There is a strong correlation between individuals leaving their role and the manager or leader that they work for.
3. Trust and empower
'As we look ahead into the next century, leaders will be those who empower others.' – Bill Gates
Be forward-looking and define what competencies you need to succeed. What do you need from others around you? How do you bridge any gaps?
So how does this translate to the workplace?
• Embrace accountability, and don't blame others when things don't go according to plan.
• Learn from your mistakes and accept it is OK to fail because this enables learning. Build a culture around you where this becomes the norm.
• Consider your strengths and don't focus on the negatives or on correcting perceived weaknesses. Use this approach to provide feedback to those around you.
• Do you have autonomy and flexibility? Do you allow this in others?
• Manage your performance and that of others with honesty and integrity. Don't dodge difficult conversations.
4. Build effective networks
'If there is any one secret of success, it lies in the ability to get the other person's point of view and see things from his angle, as well as your own.' – Henry Ford
You have a vision of the future, you know what you want to do and you have a motivated team behind you – how do you drive through what you need to? How do you lay the foundations for your future plans? Answer: by targeting key influencers and bringing them along with you.
So how does this translate to the workplace?
• Identify the relevant key influencers – both now and with an eye to the future.
• Invest time in building the relationships. Grow and leverage your personal network.
• Consider a mentoring relationship – you never know where this could lead in the future.
5. Seek out lifelong learning opportunities
'Tell me and I forget. Teach me and I remember. Involve me and I learn.' – Benjamin Franklin
Being visionary is not a snapshot in
time - visions evolve and grow. What
learning have you undertaken, or nurtured,
to ensure continual improvement and
innovation?
So how does this translate to the workplace?
• Actively seek feedback to further
develop your personal brand and
competencies. Give constructive
feedback to others to coach and
develop those around you.
• Seek out valuable experiences or
broader responsibilities that push you
outside of your comfort zone. Embrace
these as opportunities. Create learning
opportunities for others.
• Embed your learning – practice! Allow
others the time to do the same.
• Consider coaching or mentoring, both
for yourself and for others around you.
Nurture talent and leadership skills.
Give back.
To summarise leadership in the words
of John Quincy Adams:
‘If your actions inspire others to dream
more, learn more, do more and become
more, you are a leader’.
Reference
1. Career Anchors Self-Assessment, Third Edition, Edgar H. Schein, Pfeiffer, 2006.
2015 DIGITAL LEADERS
61
LEARNING AND DEVELOPMENT
LEARNING ORGANISATIONS
Sara Galloway, Senior BA for NHS Blood & Transplant, discusses how organisations only really learn when they are given a nurturing environment where they can learn by making mistakes.

If your organisation were a child, what would they be like? Go ahead, take a few minutes to write down your thoughts. What did you write?

Maybe you said it's like a toddler in the 'terrible twos' - you're trying to teach him how to behave, how to be polite to Aunt Millie and how to use a knife and fork, yet organisational life seems to be one long battleground over whether peas belong on the plate or on the floor.

Perhaps your organisation is curious - she has her hand in every open cupboard, toys are disassembled to 'see how they work', then dropped where they fall, and every other word she says is: 'Why?'

Or did you describe a child who quietly goes to school, does his homework on time, goes to football practice and doesn't complain about having to eat vegetables before dessert?

You probably have all of these aspects somewhere in your organisation - the colleagues who resist change, those who are full of ideas of how to make things better, and the ones who just get on with delivering against the bottom line.

How do you harness the best of all three? How do you take the vitality of the curious for tomorrow's success, temper it with the pragmatism of surviving today so there is a tomorrow, and have the strength of will to define what business you're in and what you're not?

Organisations must learn, just like children, about their operating environment: what constitutes success and how to achieve it. More often than you might think, 'success' can involve overturning received wisdom and finding new ways to meet customer needs.

Organisations are made up of people, and frequently we hire the young and those with an eye for improvement, yet our organisations are not willing to learn or, more accurately, to fail. Failing is an integral part of learning, but while visible failure costs time and money, the invisible failure of doing nothing costs more in the long run.

Do you expect a child to successfully spear peas on a fork the first time he tries? Do you expect failure as he experiments with different grips and techniques? Do you wade in and feed him so he never learns himself? Do you applaud him for using a spoon instead if it gets the peas to his mouth, because it's met his need to eat in a novel way?

How do you treat new ideas within your business?

What about the child with dismantled toys on the floor? Do you see a mess, or do you see, like Edison, 10,000 things that didn't work? Maybe that snow-plough front sticking out from under the sofa would make a brilliant pea-shoveller.

Can you put the two children together? Better yet, can you create an environment where they'll make the connection for themselves?

Finally, consider the child happily playing with Meccano and getting better and better at building from instructions. How can you involve them? They're probably just the person who knows how to build a handle for that snow-plough to make it a usable piece of cutlery!

So that's all very well in a hypothetical playgroup, but what about in the real world? From my earliest working days in admin support to my current position as a senior leader, the key determinant of learning and change has been organisational culture.

When those with formal authority - from line managers to directors - support novel approaches and solutions, ideas will flow. In any organisation, dedicated staff with a good idea will seek out someone with the power to support it, but wouldn't life be better for everyone if they didn't have to go looking?

The most, indeed only, successful example of becoming a learning organisation I have seen began at the grassroots. A government department with a mature workforce spent several years recruiting bright young graduates.

Faced with an aging intranet with heavily restricted publishing rights, yet a frequently-stated value from senior management of 'share knowledge by default', someone in a technical area installed an empty Wiki on the server under his desk. There was a kerfuffle. There were cries of: 'That's not how we do things! Look, we have an intranet with pages for each team. Why can't you just use those?' There were complaints from the in-house library about how it would be impossible to categorise information if just anybody could create a page and link it up however they saw fit.

And yet the Wiki grew. A few pages on teams here, a couple of categories for frequently used jargon there. Some formal, business-sanctioned projects started using it to share their minutes and design decisions.

The Wiki wouldn't die. Information security got involved, and some policies were agreed about what could and couldn't be added. And, quietly, a senior manager backed the Wiki. When the heat got too much, he stepped in, explained why he was supporting it, and stepped away again.

More teams added pages. People started adding personal pages and career histories, and past experience, and volunteering themselves as experts in certain systems, fields and topic areas. More projects used it. The complaints died down.

The senior manager quietly sponsored a team, open to anyone who was interested, to meet for coffee to discuss other ideas that would be useful. An instant messenger service was born, used initially by the open team.

Projects used the IM service to exchange information between meetings. The Wiki was formally adopted, and moved to a live environment server. Policies were set up around IM chat rooms and acceptable use.

People connected with each other. I made more useful connections in months of seeing who had edited the same Wiki pages and who was in the same chatrooms than I had from years of working in a landmark open-plan office. This is the age of the knowledge worker, not the geographical office mate.

Other managers, at all levels, added their support to staff improving the working environment. Someone else wrote a system that figured out who was sitting at which hotdesk and showed you on a map, in a single sweep replacing years of failed and unused magnetic nameplates in two different locations on each floor.

Many potential objections to this system were headed off at the pass by those who had been involved in earlier successes. There was a smaller kerfuffle this time, because 'the organisation' had learned that these ideas were worthy of a trial, and it too was eventually supported formally.

There were other ideas that started, flared briefly, and failed, such as a collaborative document editor that simply didn't have a 'market' at the time it appeared. But the culture had become one of accepting ideas and keeping what worked.

All this successful, productive change happened in a department of over 5,000 people because one senior manager said: 'Yes, that looks like a good idea and I will support you'.

That led to a small success. Other ideas followed, with other successes. Other managers added their support, and the organisation collaborated, and learned.

Will you be that person?
LEARNING AND DEVELOPMENT
MEANINGFUL MENTORING
Jill Dann FBCS discusses nurturing the creation of business value through meaningful mentoring.
Effective mentoring occurs when a more skilled and experienced person is able to both challenge and support, in equal measure, a less skilled or experienced person.

Mentoring functions within the context of set goals, or defined well-formed outcomes for the mentee, with an end date for their achievement.

Is mentoring an organisational process that creates value in your business, i.e. one that is independent of individuals and repeatable by others, or is the value in your people? One way to find out which is the case is to ask the following questions:
1. Does your organisation have valuable intellectual property that is vested in a small number of employees (i.e. in their heads)?
2. How exposed are you to people leaving your company – does it affect your net worth or goodwill?
3. Have you got the capability to bounce back if key workers retire early or move on, potentially to new entrants or rival organisations?
4. Have you got individuals, potentially frustrated entrepreneurs, in your organisation who are waiting to disrupt the market in which you operate?
5. Do you have a safe pair of hands who can assist you to [ethically] access employee thinking, especially that of your star performers?
If your answer to these questions is mostly
‘yes’ then your business value is likely to
be very dependent on people. To keep this
‘resource’ safe one of the business
contingency measures you can put in
place (with easily accessible tools these
days) is a mentoring scheme.
Firstly, you can set organisational
objectives for the scheme, which include
you getting to hear discreetly the
‘grapevine’ information that you need;
there’s nothing unethical about this
suggestion.
You will need to set parameters
on confidentiality and, in parallel, do
everything you can to increase feedback
on how to secure your employees’
investment in you as an employer. As a
result you will be clearer on what you
need to do for them and what they can do
for you in value creation.
There can be a bit of snobbery around
the calibre of mentor that an individual
receives access to so be careful about this;
mentoring is not a superficial process. Get
your best heads together and cushion any
fears mentors have about sharing their
knowledge; reward the behaviour both in
public and privately.
Get some advice on the eligibility
criteria that you are going to set for your
mentoring scheme so that it is open to all
that meet the criteria. Make it clear that
outcomes will be measurable and tracked.
Communicate the personal development
available to all employees at the same
time.
Reasons to mentor
Organisations have many reasons for introducing mentoring schemes:

1. Mentoring schemes have been used by organisations as evidence that they comply with equality and diversity legislation. There are legal, political and employee-relations risks associated with non-compliance with equality legislation, as well as the opinion of your customers or consumers.
2. When the capability of the organisation needs to change, individuals may need to be upgraded to enhanced roles with more developed capability to fill the gap. Mentoring can augment other in-house interventions or support resources. Mentees can enhance their credibility, confidence and competencies whilst receiving career advice and support.
3. Mentoring and coaching can take place in the workplace, potentially negating the need for employees to be absent from the workplace, such as on residential training courses. The mentee can take on challenges with sponsorship behind them whilst receiving feedback on their performance, and build their confidence in new tasks or roles. When a large stretch is involved for an individual, they can be protected from any unfair criticism whilst receiving specific, evaluative feedback to improve skills and confidence in new roles.

Promoting equality
The key to the success of mentoring across differences, or cross-cultural mentoring, is that organisations have wider responsibilities for promoting equality by:

• ensuring that the pool of people being considered for promotion and key assignments reflects the diversity of the organisation, so eligibility criteria must be well-crafted;
• promoting and addressing equality and diversity issues;
• challenging the stereotyped notions of the capabilities (or obtainable career transitions) of people from particular backgrounds or gender differences.

The purpose behind a scheme can be renewed and adapted as needs change over time.
Stage by stage implementation
The stage by stage implementation of a
mentoring scheme might look like this:
1. Development of a brief that
establishes the organisational needs
and benefits plus a consultation
process to obtain anonymous feedback
on the concept and its implications.
2. A feasibility study of the implementation
requirements, particularly if there is to
be supporting infrastructure, software
applications and integration with other
people management and development
tools. A pilot scheme may be realistic
and acceptable.
3. Development of a business case
identifying the organisational structure,
scope, constraints, business and
technical options with justification,
risks, costs and benefits. The likely fit
of accountabilities within the
permanent organisation structure is
important so that any virtual
organisation established as a project
or programme has the authority and
sponsorship required to drive it to
successful completion.
4. There are different types of engagement scheme.
Low mentee engagement:
• a pal or buddy scheme;
• group mentoring;
• informal one-to-one coaching schemes;
• networking activities.
High mentee engagement:
• one-to-one mentoring;
• peer/role model;
• objective setting;
• very focussed on the mentee;
• networking activities.
One of the key things to identify is that there is usually a cost to doing nothing, such as the loss of valuable intellectual property if your talent decides to go elsewhere – you need to cost the business's exposure to such losses.
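Costing that exposure can be as simple as an expected-loss comparison. The sketch below is a back-of-envelope illustration only; every role, probability and figure in it is hypothetical, not taken from the article.

```python
# A back-of-envelope sketch (all figures hypothetical) of costing the
# exposure of doing nothing versus running a mentoring scheme.

def exposure(departure_prob, loss_if_leaves):
    """Expected annual loss from a key person leaving with their know-how."""
    return departure_prob * loss_if_leaves

key_people = [
    ("lead architect", 0.15, 250_000),   # (role, P(leaves this year), loss in £)
    ("top salesperson", 0.25, 400_000),
    ("ops specialist", 0.10, 120_000),
]

do_nothing = sum(exposure(p, loss) for _, p, loss in key_people)
scheme_cost = 60_000  # illustrative annual cost of a mentoring scheme

print(f"expected loss doing nothing: £{do_nothing:,.0f}")
print(f"mentoring scheme cost:       £{scheme_cost:,.0f}")
```

If the expected loss comfortably exceeds the scheme cost, the business case writes itself.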
This is an item that needs to be on the
C-suite agenda, not solely left to HR and
line managers to formulate and implement.
You have been warned!
LEARNING AND DEVELOPMENT
PROCESSES
AND PEOPLE
Dr Janet Cole MBCS, Nick Fernando MBCS and Dr Chris Littlewood of the BCS Business Change Specialist
Group discuss why organisations need process learning hubs.
The current business climate demands organisational agility and the full leveraging of the knowledge capital of the workforce to maintain a competitive edge. This is likely to continue, and to become more demanding, for the foreseeable future; hence the impetus for big data projects costing millions of pounds.
If organisations are to enhance their competitiveness, then better mechanisms are needed to capture employees' collaborative learning in situ; to use it to improve productivity and the efficiency of business processes; and to retain and share the generated know-how unique to each organisation (Fischer, 2011)1 amongst its employees to better support their working practices.
While we know that the majority of work-based learning takes place in doing the work, there are ongoing challenges in providing situated training when there is any change to business processes or when new employees join.
A core issue and challenge is how most training is currently delivered in the workplace. For example:

• Face-to-face training sessions are usually delivered to small groups either in-house or off-site. This may be costly in terms of resources such as time and money, plus it may not be suitable for all trainees in the group, considering their prior knowledge and specific work needs.
• E-learning provides greater flexibility in delivery because it is usually on a one-to-one basis. However, depending upon course content and structure, it may prove inadequate to meet individual specific work needs.
• Collaborative learning is the most challenging to capture and deliver. Typically the learner receives in situ training from one or more members of a work team, or while watching and listening to what goes on around them. Capturing collaborative learning in situ, while the work is being undertaken between workers, is the most challenging for training. Yet it is essential that the collective pool of workplace knowledge is gathered, shared and grows. True learning support for collaborative worker actions has been the holy grail of e-learning developers, software and usability designers.
We must always be aware that employees
rarely work in isolation. The work they
receive as inputs are the outputs of
someone else’s work; the work they
produce as outputs become the inputs for
another person’s work.
Together they work to fulfil part of
a much larger organisational business
process that may span one or several
departments, both locally and overseas.
In trying to do this there is a lot of
‘active’ learning taking place: on the job,
through social exchange and possibly
some training. A very large part of this
learning is collaborative. The learning and
application tend to happen at an intense
pace to fulfil the process and improve
it where possible. All of this happens
iteratively.
Individual members that make up a
team will carry out either common or
specialised tasks as part of a business
process. To complete the task the team
works together, pooling their knowledge
resources to become collective knowledge
across the work environment.
Sharing learning to optimise business
processes
A competent team would also be iteratively
optimising the process and sharing best
practice: learning in the improvement
loop. A lot of this ‘know how’ is generated,
shared and consumed across a range of
media and methods. Conversations,
documents, files, links, comments, searches,
courses, meetings and so on mean that
the collective in situ way of working and
learning does not happen in isolation, but
is decentralised. It is fragmented and
disorganised; and some of it may get lost.
Effectively capturing this ‘learning loop’
and its output is essential to being able to
work with a process and make continuous
performance improvement. We also
need to consider that a team structure is
dynamic, with staff joining and leaving the
team, and large activities being distributed
across multiple teams and outsourcing/
off-shoring (Seely Brown, 2013)2.
The learning interventions we design
need to reflect this process orientation.
More so, they must meet the competence
needs for the processes while also being
tailored to the individual proficiency
levels of the employee. For example, if
an employee is proficient at 50 per cent
of a task and they undergo face-to-face
training, then 50 per cent of the training is
a waste of limited resources: financial and
the time lost impacting on work that might
otherwise have been done.
There is then the issue of how quickly an employee engages with such interventions, how well-matched their development requirements are with the training offered, how long they take to complete and how quickly they can be applied back into the workplace. These are important issues in realising a more personalised and adaptive approach to achieving corporate learning outcomes (Learning Solutions Magazine, 2014)3.

Introducing a process learning hub
Considering the range of user-generated content and disparate training available, the preferable solution would involve a system that can mediate a 'process learning hub'. It would allow a centralised body of content, such as a training course or knowledge base, to be customised by employees to deliver a more relevant sequence of learning and performance support to complete a process.

The content in the hub can be further edited by the members of a team to update content and best practice, effectively refining the content as the process is refined. The result is a customised body of knowledge closely aligned to the process or 'way of working'. New team members can use it to learn the necessary skills to effectively fulfil processes; existing staff can use it as a reference or support tool. The system would also adapt what is presented to an employee based on familiarity and competence with the process.

Automating data collection to refine collective user learning
The 'process learning hub' would store the activity of users across their bespoke content set. This sufficiently large body of data can then be used to understand user behaviour, trends, problem points and possible process issues.

Importantly, such a data set can assist in determining specific performance needs and anticipate how content is interacted with to optimise structure and relevance (Mayer-Schönberger and Cukier, 2014)4. This would be automated through machine learning algorithms.

As more staff engage with and use the hub, the iterative algorithm would improve the selection and delivery of content, providing employees with just what they need to learn to complete processes using collective know-how.
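The kind of selection such an algorithm might perform can be sketched very simply: skip what the learner has already mastered and rank what remains by how useful it has proved to colleagues. This is a toy illustration, not the authors' system; the module names, scores and threshold are all invented.

```python
# A minimal sketch of adaptive content selection: modules the employee has
# already mastered are filtered out, and the rest are ordered by how useful
# they have proved to colleagues working the same process.

def select_content(modules, proficiency, helpful_votes, threshold=0.8):
    """Return modules the learner still needs, most-endorsed first.

    modules        -- list of module names making up a process
    proficiency    -- dict: module name -> score in [0, 1] for this learner
    helpful_votes  -- dict: module name -> count of positive peer ratings
    threshold      -- proficiency at which a module counts as mastered
    """
    needed = [m for m in modules if proficiency.get(m, 0.0) < threshold]
    return sorted(needed, key=lambda m: helpful_votes.get(m, 0), reverse=True)

modules = ["intake checks", "data entry", "approval routing", "archiving"]
proficiency = {"intake checks": 0.9, "data entry": 0.5, "approval routing": 0.2}
votes = {"data entry": 3, "approval routing": 7, "archiving": 1}

print(select_content(modules, proficiency, votes))
# ['approval routing', 'data entry', 'archiving']
```

A real hub would learn the ranking signal from activity data rather than raw vote counts, but the shape of the decision is the same: personal proficiency gates what is offered, collective behaviour orders it.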
Emerging opportunities for work-based
learning
It is long overdue that learning management system vendors incorporate adaptive learning technology into their systems, and some now do.
A subset of such vendors also provides
integration with other platforms such as
Microsoft SharePoint and Alfresco, which
offer social content management and
communication. The kernels of a ‘process
learning hub’ could likely already be in your
organisation.
Organisations such as yours can realise
greater value from the resources you
already have and maximise efficiency of
your business processes by providing
‘process learning hubs’.
Computer Supported Collaborative
Working (CSCW) and Learning (CSCL) need
to facilitate collaborative learning that
is adaptive so that a team can improve
work practices and processes (Goggins
et al, 2012)5. This has greater value if the
learning can be shared in situ within a group as it works through daily routines of data analysis and ad hoc problem-solving scenarios.
References
1. Fischer, G. (2011). Understanding,
Fostering, and Supporting Cultures of
Participation. Interactions, May-June, ACM.
USA.
2. Seely Brown, John (2013). Foreword.
In: Sean Goggins, Isa Jahnke & Volker
Wulf (Eds.): Computer-Supported
Collaborative Learning at the Workplace:
CSCL@Work. New York: Springer, pp. v
– viii.
3. Learning Solutions Magazine, (2014).
Skillsoft and IBM Research: Harnessing
Big Data in Enterprise Learning by News
Editor: Learning Solutions Magazine.
[online] Available at: http://www.learningsolutionsmag.com/articles/1342/skillsoft-and-ibm-research-harnessing-big-data-in-enterprise-learning [Accessed 22 May 2014].
4. Mayer-Schönberger, V. and Cukier,
K. (2014). Learning with Big Data: The
Future of Education. 1st ed. Eamon Dolan/
Houghton Mifflin Harcourt, p.16.
5. Goggins, S., Jahnke, I. and Wulf, V.
(2012). CSCL@Work revisited – CSCL and
CSCW? Are There Key Design Principles
for Computer Supported Collaborative
Learning at the Workplace? In GROUP ’12,
Sanibel Island, Florida, USA.
LEARNING AND DEVELOPMENT
E-LEARNING
BUSINESS ANALYSIS
Julian Fletcher MBCS provides an overview of how business analysis techniques can be applied to ensure online learning materials provide a cost-effective and efficient means of delivering the training an organisation's staff need about its products, processes and regulations.
The last decade has seen business analysis
emerge as a major subject discipline.
Business analysis (BA) is the application
of techniques to enable the development
of systems that better support business
objectives.
A simplified business analysis process model for developing such learning materials is to identify the gap between the client's existing 'as is' business model and its desired 'to be' business goals, which the prototype learning material is written to help deliver.
This process depicts how a client will
often have some business targets they
need their staff to meet and that some
form of training is required to make their
staff suitably equipped to achieve them.
A business analyst’s task is to ascertain
what the client’s staff training needs
actually are through carrying out a ‘gap
analysis’; in other words identifying the
gap between the staff’s current level of
knowledge and the level they need to get to
in order to achieve their set targets. Once
this gap has been identified, modelled and
documented, a set of requirements can be
drawn up and used to produce prototype
learning material in a format that can be
reviewed and incrementally refined in
collaboration with the client (in an agile
manner) until it is fit-for-purpose.
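At its core, the gap analysis compares what staff can already do with what the target state requires. The toy sketch below makes that concrete; the skill names are hypothetical, invented for illustration rather than taken from the case study.

```python
# A toy illustration of the 'gap analysis' step: the requirements for the
# learning material are simply the competencies in the 'to be' target that
# the 'as is' assessment shows staff do not yet have.

def gap_analysis(as_is, to_be):
    """Return the competencies to train, preserving the target's order."""
    return [skill for skill in to_be if skill not in as_is]

as_is = {"product knowledge", "negotiation basics"}
to_be = ["product knowledge", "negotiation basics",
         "Chinese business etiquette", "building guanxi networks"]

print(gap_analysis(as_is, to_be))
# ['Chinese business etiquette', 'building guanxi networks']
```

In practice the comparison is done through interviews, workshops and modelling rather than a set difference, but the output is the same: a list of requirements for the prototype learning material to address.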
There are a number of relevant business analysis techniques that can be used in this gap analysis and prototyping activity. These will be illustrated through the use of a case study exercise in which the client has requested that the BA design learning content to teach Chinese business culture to the client's salesmen.
1. Investigating the ‘as is’ situation
First of all the BA seeks to identify all
those stakeholders who are in some way
impacted by the learning content to be
developed.
These stakeholders can then be
included in a Peter Checkland ‘rich picture’,
which informally captures, in sketch/
cartoon format, the existing business
context in which an e-learning intervention
is to be deployed. For this example: the
client and their salesmen, the target
Chinese market and the learning material
and its desktop, tablet and mobile delivery
mechanisms would all be sketched.
Through a combination of one-to-one
interviews and stakeholder workshops,
with both the client and staff trainees,
the BA can gain further insight into this
business context and use this to develop
training that is relevant, engaging and
sympathetic to the learning styles of the
trainees.
It is also important for the BA to
carry out sufficient research to acquire
the requisite domain knowledge of the
subject matter to be taught. In our Chinese business example, that would include understanding how to apply the business etiquette of 'Guanxi' (networking) and 'Mianzi' (face/social status) in negotiating sales.
The ‘context diagram’, uses a UML
(Unified Modelling Language) notation to
represent the proposed learning content
(as an IT system) in relation to the wider
world (the stakeholders and other systems
interacting with it) based on stakeholder
interviews, workshops and the BA’s
acquisition of domain knowledge of the
client’s business.
In the context diagram, the learning
material is regarded as a ‘black box’
with the internal details of how it is to be
defined and constructed, in terms of its
curriculum and content, yet to be defined.
Having drawn such a diagram the BA
can then review it with the stakeholders
(referred to as ‘actors’ in UML and
depicted as matchstick characters in the
diagram) to check that it does indeed
represent their proposed training needs.
2. Eliciting the requirements for the 'to be' situation
Now that the existing context for the proposed learning solution has been analysed, attention can be turned to the techniques used to specify the requirements for this learning – the desired 'to be' situation.
To begin with, a short scenario can be written to describe the desired impact of the learning solution, encapsulating the stakeholder perspectives. Peter Checkland's CATWOE (customer, actor, transformation, world view, owner and environment) can be used as a framework for writing this:

'The e-learning solution is a system purchased by the client (O), to enable his sales staff (A) to carry out online training (T) to help them make sales in China (W) by familiarising themselves with the cultural aspects of doing business in that country whilst living and working there (E)'.

Individual user stories could then be written from the perspective of each of the stakeholders, to provide further detail. Once these stories have been written and agreed with the client, the BA can add the internal details to the aforementioned 'black box' in the context diagram, so that it is transformed into a UML 'use case diagram', with each use case (displayed within the boxed system boundary) representing a piece of functionality used by a stakeholder.

Each of the individual use cases in the use case diagram can then be broken down into more detailed steps to show the interaction and flow of information, which will form the 'blueprint' for the learning material.

The basic flow (and alternative flows where applicable) of a use case can be presented in further detail in the form of a 'swim lane diagram', used here to model the online assessment activity of a trainee. This diagram, based on a swimming pool metaphor, shows the process steps carried out by a user to complete their assessment, with the steps being distributed amongst the appropriate swimlanes representing the user, user interface and data store.

The use cases, such as the client's supervision of training, can also be used as the basic building blocks to model the system data for the e-learning training, through incorporating them within 'entity relationship (ER) diagrams' and/or 'class models'.
The ‘ER model’ represents this use case
as a series of connected ‘entities’ each
of which the client wishes to collect and
store data about e.g. trainee scores (in this
example the entities are ‘client supervisor’,
‘online assessment’ and ‘trainee’).
This ER model can be used as the
template for the training system’s
underpinning database.
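One way that template could seed a database is sketched below with Python's built-in sqlite3; the table and column names are illustrative choices, not taken from the article.

```python
# A sketch of the three-entity ER model above as a database schema, using
# Python's built-in sqlite3 (table and column names are illustrative).
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE client_supervisor (
    supervisor_id INTEGER PRIMARY KEY,
    name          TEXT NOT NULL
);
CREATE TABLE trainee (
    trainee_id    INTEGER PRIMARY KEY,
    name          TEXT NOT NULL
);
-- The assessment entity connects supervisor and trainee, and stores the
-- data the client wants to collect, e.g. trainee scores.
CREATE TABLE online_assessment (
    assessment_id INTEGER PRIMARY KEY,
    supervisor_id INTEGER REFERENCES client_supervisor(supervisor_id),
    trainee_id    INTEGER REFERENCES trainee(trainee_id),
    score         INTEGER
);
""")

tables = [r[0] for r in conn.execute(
    "SELECT name FROM sqlite_master WHERE type='table' ORDER BY name")]
print(tables)  # ['client_supervisor', 'online_assessment', 'trainee']
```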
The ‘class model’ consists of three
classes (or types) belonging to the use
case. These classes represent the client,
sales trainee and assessment respectively
and form the templates for producing the
actual use case objects (occurrences). Each
class is named, contains a series of data
items or attributes constituting its stored
properties and has a series of functions or
operations that it can carry out.
Where appropriate, this class model
can be used to implement the data model
for the training system using a high-level
object-oriented language such as C#, Visual
Basic or Java.
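The article suggests C#, Visual Basic or Java; a compact Python rendering is shown here for brevity. The attribute and method names are illustrative, and the marking logic is a placeholder.

```python
# The three classes of the model above - client supervisor, sales trainee
# and assessment - each with stored properties and operations.

class OnlineAssessment:
    def __init__(self, topic, pass_mark=60):
        self.topic = topic
        self.pass_mark = pass_mark

    def mark(self, trainee):
        # Placeholder operation; a real system would grade the answers.
        return self.pass_mark

class SalesTrainee:
    def __init__(self, name):
        self.name = name
        self.scores = []          # stored property: past assessment scores

    def take_assessment(self, assessment):
        score = assessment.mark(self)
        self.scores.append(score)
        return score

class ClientSupervisor:
    def __init__(self, name):
        self.name = name

    def review(self, trainee):
        """Return the trainee's best score so far, or None."""
        return max(trainee.scores, default=None)

trainee = SalesTrainee("Lee")
trainee.take_assessment(OnlineAssessment("Guanxi and Mianzi"))
print(ClientSupervisor("Kim").review(trainee))  # 60
```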
3. Producing the prototype learning
material
Having modelled both the ‘as is’ training
requirements of the client’s business and the
‘to be’ goal for their sales staff to win
business in China, prototype online
training to bridge the gap between the ‘as is’
and ‘to be’ situations can now be built and
reviewed by the stakeholders. Their
feedback can then be used to incrementally
improve the prototype. This is an example of
agile development. A wireframe
application can be used to ‘mock up’ the
initial look and feel of this training from which
a more polished final version can evolve.
Summing up, the judicious application
of business analysis techniques can help
to ensure that e-learning courses are
developed to fully realise those desired
business benefits.
2015 DIGITAL LEADERS
69
PROJECT MANAGEMENT
AGILE EVOLUTION
The Agile Manifesto was written in 2001 and agile methods have
seen widespread adoption since then. Many organisations have
implemented agile methods in their software development
departments. These range from management processes such as sprints and scrums to more technical practices such as automated unit testing and continuous integration. With this in mind Robert Cowham,
MBCS, Vice Chair of the BCS CMSG, discusses continuous delivery
and the state of DevOps.
The initial results were often very positive,
but organisations typically realised that
the gains were only in part of the life cycle.
Even with developers delivering regular
sprints of new functionality, processes
such as testing and release/deployment
would cause bottlenecks and delays.
Thus releases to end-users or
customers were still only happening
every few months at best and the
organisation was not getting much overall
benefit. Forrester’s 2013 report on the
development landscape showed the
stalling of agile practices and difficulties in
attempting to scale them.
Addressing these issues gave rise
to DevOps and continuous delivery (CD)
practices, with a focus on how to optimise
the whole lifecycle.
The term DevOps was first coined in
2009 and started gaining momentum in
2010. The book Continuous Delivery by
Jez Humble and Dave Farley was written
in 2011 and was based on some years
of best practice experience in different
organisations. It covers many different
effective technical practices, ranging from
version control to managing environments,
databases and other aspects of
automation of the development pipeline.
In a sense these approaches are going back to some of the roots of agile methods in lean manufacturing principles. I prefer the term CD because it reduces the risk of ignoring important aspects of the whole lifecycle – it is more than just about improving the cooperation between development and operations, although that is very important. Delivering new releases of software or new products requires coordination with other parts of your organisation as well, such as support, marketing and sales. However, it is the principles that are more important than the precise name!
The state of DevOps
The ‘State of DevOps’ 2014 report1 (by Puppet Labs, IT Revolution Press and ThoughtWorks, and freely available) describes the relationship between strong IT performance and business goals such as profitability, market share and productivity – not perhaps surprising considering Marc Andreessen’s 2011 proclamation that ‘software is eating the world’.
For example, mobile operators, financial services providers or engineering companies may not sell software, but they certainly depend on it. With the internet of things (IoT) or the evolution of hardware vendors into software-centric manufacturers (hardware is the new software), the importance of IT is only increasing.
The other key factors highlighted in the report are that DevOps practices improve IT performance and that organisational culture is also a strong predictor of IT performance. An interesting finding, though perhaps less surprising given the developer-led adoption of agile methods, is that job satisfaction is the number one predictor of organisational performance! High-performing organisations report deploying code 30 times more frequently, with 50 per cent fewer failures than other organisations.
The key practices that underpin such performance are:
• Continuous delivery – this means ensuring that your software is always potentially releasable (the business needs to decide separately how often to actually release, taking into account the ability of customers or end users and other supporting processes to handle releases).
• Version control of all artifacts – this includes environments (treating infrastructure as code). Interestingly, version control of production artifacts is a higher predictor than version control of code – perhaps showing the issues and complexity around managing configurations.
• Automation throughout the lifecycle, particularly of testing, including acceptance and performance testing.
• Good monitoring of systems and applications.
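The ‘always potentially releasable’ idea in the first practice can be sketched as a simple pipeline gate: every build runs through the same automated stages, and only a build that passes them all becomes a release candidate. The stage names below are illustrative assumptions, not taken from the report:

```python
def run_pipeline(build, stages):
    """Run each automated stage in order, failing fast on the first error.

    Returns True only if the build is potentially releasable.
    """
    for stage in stages:
        if not stage(build):
            return False
    return True

# Illustrative stages: in practice each would invoke real tooling
# (unit tests, acceptance tests, performance tests, deployment checks).
stages = [
    lambda b: b["unit_tests_pass"],
    lambda b: b["acceptance_tests_pass"],
    lambda b: b["performance_tests_pass"],
]

good_build = {
    "unit_tests_pass": True,
    "acceptance_tests_pass": True,
    "performance_tests_pass": True,
}
```

Whether the business actually releases a passing build remains a separate decision, exactly as the first bullet point says.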
A report by Evans Data commissioned in 2013 found that 28 per cent of companies
were using continuous delivery for all of
their projects, but the same companies
thought that 46 per cent of their competitors
were doing so – no one wants to be left
behind!
The benefits for organisations can be
considerable. Being able to release more
frequently, and with greater reliability
and predictability, allows the business to
experiment and evolve towards satisfying
real customer needs more quickly and
efficiently and with less risk.
By delivering software faster into the hands of end-users, the time to get customer feedback is reduced. Somewhat paradoxically, doing this reliably and consistently usually results in higher quality. This is a side effect of the mantra – ‘if it hurts, do it more often!’
Of course there are many challenges to implementing such methods and it is always going to be harder in organisations with decades of investment in their software systems.
Making progress on the CD journey
So how are organisations successfully implementing CD practices?
The initial driving force behind DevOps and CD tended to be smaller companies with modern ‘on the web’ infrastructures. People on small teams are more likely to be multi-skilled, which makes it easier to apply things like development practices to operations. These include use of version control and treating infrastructure like code, with automated tools and testing processes. It can be quite a challenge to change the mindset of administrators from ‘I’ll just log in and fix that’ to ‘I’ll make a change to a configuration file, test it locally and check it in to version control, before running my deployment tool to propagate the change.’
It became clear that the same principles could also scale to larger companies, in spite of some challenges. Organisations of any size will need deep technical expertise in areas such as database administration, technical architecture, testing, security, system administration and operations. Traditionally these skills reside in separate teams, often with quite widely separated reporting structures.
As with most such improvement initiatives, changing people’s mindset is often the biggest challenge. This requires both bottom-up activity and enthusiasm to improve individual processes, and also top-down management support. The management support is required to overcome issues such as organisational inertia, reporting structures and how progress is measured and people are rewarded.
Coordination between teams is vital to CD, so a matrix-type organisation, with representatives of different technical teams brought together for particular projects, has been shown to work well. Assigning experienced and respected people within your organisation to any such new initiative is important. They need an appropriate mandate and the ability to work across the organisation and pull together teams to solve problems. Focusing on improving or automating workflow and handovers between teams is often a major win.
While there are plenty of technical challenges to achieving CD, there are also solutions, proven approaches and supporting tools, and steadily more successful case studies being reported.
The role of architecture is fundamental for longer-term success. Appropriate layering and interfaces can help to support practices such as automated testing. While existing systems may not have been designed with this in mind, it is important to come up with an architectural vision and look to evolve, even if it takes years. Large companies with mainframe-based systems with decades of development have managed to do this using approaches such as adopting a service-oriented architecture.
It is beneficial for all teams to improve their individual practices, for example by increasing automation within their team and ensuring that the many different tools in use through the life cycle can co-exist and work together. Tools range from requirements management through design, coding and testing to continuous integration and automated deployment. Automation capability needs to be a key selection criterion for any tool, although increasingly many have web-based application programming interfaces (APIs).
As previously mentioned, ensuring that all assets are in version control has a very high correlation with IT performance and is a key foundation for other tools. Change tracking, audit trails and impact analysis for things like risk assessment are all driven from this foundation. It is very important to ensure that any tooling in this area will scale appropriately and perform adequately to meet the needs of the organisation.
There are some powerful workflow tools available to help with implementation, but they need to be used with care. While they may improve visibility and consistency, the implementation may not always be optimal overall. One financial services client I worked with recently had implemented ServiceNow, and creating a new environment for testing involved some 20 different sub-tasks. These were queued and assigned to different teams as appropriate, with automated onward queuing after each step. In this case, some tasks took only a few minutes to complete, and yet the delay in noticing the request on the queue and processing it sometimes meant that hours passed between steps. Thus overall it often took two to three days to fully create the environment, when - if the whole thing were automated - it could be done in less than an hour. The tool itself was very capable, but it needed to be used in the most appropriate manner.
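The arithmetic behind that environment-creation anecdote is worth spelling out. With illustrative numbers (not the client’s actual figures), the elapsed time is dominated by queue delays between steps, not by the work itself:

```python
tasks = 20                   # sub-tasks needed to create the test environment
work_minutes_per_task = 2.5  # actual effort per task: only a few minutes
queue_delay_minutes = 150    # typical wait before each queued task is noticed

# Manual handovers: every task pays the queue delay as well as the work.
manual_elapsed_minutes = tasks * (work_minutes_per_task + queue_delay_minutes)
manual_elapsed_days = manual_elapsed_minutes / 60 / 24  # a little over two days

# Full automation removes the waits, leaving only the work itself.
automated_elapsed_minutes = tasks * work_minutes_per_task  # under an hour
```

Even though the work per task is trivial, the cumulative waiting turns minutes of effort into days of elapsed time – which is exactly the gap automation closes.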
Aggregating marginal gains
Another way of looking at CD is applying
the continuous improvement methods that
helped make the British cycling team so
successful under the leadership of Sir Dave
Brailsford. At the Beijing Olympics in 2008
they won seven out of 10 gold medals on
offer in the velodrome (plus one out of four
golds on the road), and four years later did
exactly the same at the London Olympics.
This was an outstanding achievement
by a single country, and it is not as if
the other countries were not trying their
hardest to improve in the intervening four
years! The approach was to break every
little action down and attempt to improve it
even by only one per cent, which resulted
in a significant overall gain when put back
together.
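The compounding effect of those one per cent gains is easy to underestimate; a quick calculation shows how they multiply up (the number of steps is an arbitrary illustration):

```python
steps = 40            # number of small actions in the overall process (illustrative)
gain_per_step = 1.01  # each action improved by just one per cent

overall_gain = gain_per_step ** steps
# Forty separate one per cent gains compound to roughly a 49 per cent
# improvement overall - far more than the intuitive "40 x 1%".
```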
Applying this sort of mindset to your
overall software development will reap
dividends and provide a significant
competitive advantage for those companies
who make the effort. It is about optimising
individual parts, but also keeping an eye on
the big picture and optimising the whole.
Where are you on your journey towards
continuous delivery?
References
1 http://puppetlabs.com/sites/default/
files/2014-state-of-devops-report.pdf
DEATH OF
PROJECT MANAGEMENT
Bryan Barrow MBCS considers the current issues in project management, and compares them with a
conundrum that was faced by the scientific community last century.
When President Dwight D Eisenhower
suffered from a heart attack in 1955, it
pushed the issue of heart disease up the
political agenda in a way that few other
things could.
The challenge for scientists and doctors
back then was to try and explain what
caused dietary heart disease. For the next
ten years idea after idea was analysed and
scrutinised by the academic community.
The possible causes coalesced into
two rival camps. One camp argued
passionately that fat was the underlying
cause of heart disease. The other camp
claimed with just as much certainty that
sugar was the culprit. In the end those
arguing that fat was the problem won out
over their bitter rivals.
As a result public policy has been
shaped around three beliefs. First, that
fat is the enemy, not sugar. Second, that
the reduction of fat from the diet must be
a key weapon in heart attack prevention.
Third, that the path to the eradication of
dietary heart disease should be based on a
diet low in fat and high in carbohydrates.
In the forty years that followed, since
the early 1970s, we have witnessed the
relentless rise of obesity, the continuing
carnage caused by so-called ‘low-fat’ diets
and the slow, steady death of millions of
people who might otherwise have gone on
to live active, healthy lives. You only have to plot a graph of the proportion of fat in the diet alongside the proportion of the population suffering from obesity to see the pattern.
The proportion of fat in the diet has
fallen consistently. The proportion of the
population suffering from obesity has
rocketed relentlessly throughout that entire
time. Forty years spent heading in the
wrong direction has resulted in a situation
where obesity now costs the UK public
purse almost fifty billion pounds a year.
And so it is with project management,
where I believe we have spent not just
vast sums, but also the last four decades,
wrestling with the results of incorrect
conventional thinking.
The challenge for project management
40 years ago was the question: ‘Why
do so many projects fail?’ As with heart
disease we in the project management
community thought we’d identified the chief culprit: poor project management was to blame. Since the 1970s we
have invested heavily, both intellectually
and financially, in the idea that if we fix
project management then we will deliver
projects more successfully.
We have followed that path for the
past 40 years. We have developed and
adopted bodies of knowledge. We have
created and promulgated methodologies
for project management, programme
management, portfolio management
and risk management, which are now
seen as fundamental foundations of the
professionalisation of project management.
All of this is welcome, but I believe that
we have been following a path that has
ultimately proved to be the wrong one.
The other path was the one towards better
project leadership.
Why do I say this? If you look at the vast sums of money spent on project management certification and training and plot that against project success rates, you’ll see that the trend lines show no correlation between investment in project management certification and training and project management outcomes over the past 40 years.
Let me be more specific about what I see as the differences between project management and project leadership. Project management is that branch of management devoted to the process of change; the orchestration of people and resources through a series of activities, tasks and processes to bring about a transformation from one state to another, different state.
Leadership is that branch of management concerning the development, nurturing and utilisation of influence as a way of motivating others to some end or objective. There are many ways that you could describe leadership, but from a project management perspective it is associated with power, influence, persuasion, campaigning, direction and motivation.
Now I am not for one moment arguing that projects don’t need managing. I am arguing that, in order to deliver more successfully than we have done, we need to focus more on leadership. Support for this view comes from two sources.
The first is from a Project Management Institute (PMI) survey from 2013, which asked what the most important skills to successfully manage highly complex projects in organisations were. ‘Leadership skills’ was the overwhelming choice, with 81 per cent of respondents citing it as the most important. ‘Technical project management skills’ came in at a paltry nine per cent.
The second is from my own experience, not just as a project consultant and coach, but also as speaker and conference chair for project management events. In this latter role I have spoken to hundreds of speakers and thousands of delegates. Time after time their message is the same: we need to focus on the softer skills of stakeholder management, on vision setting, on ensuring that we deliver business change alongside project change, so that we make change stick.
As we enter 2015 I still believe that there is more to be done to address the historical imbalance so that we focus more on project leadership. Let me give you three reasons why I believe this to be the case.
Failure to execute strategy is still seen
as a project management issue
First, there’s a big problem in business
concerning the execution of strategy and
business leaders see this as their number
one challenge. It’s relatively easy for
them to define a strategy, much harder to
deliver it. Somewhere, something is going
wrong between vision and execution and
many still see it as the failure of project
management to successfully deliver
strategy. I don’t agree. This is a leadership
issue, not a project management one.
We need to develop leadership potential
instead of systems and tools.
Second, there’s a real need to develop
the leadership potential of project
managers. If leadership is concerned
with influencing and motivating people
then project managers need to be skilled
leaders, not just successful administrators.
Unfortunately too much of the
discussion around project management
skills is still rooted in hard skills and
techniques: planning, scheduling, risk
management and budgeting. Instead, it
should be focused on those skills that
are vital for leadership: envisioning, goal-setting, listening, speaking, collaborating,
negotiating, supporting and reinforcing,
coaching and mentoring. If we don’t
address the need for better leadership now
with new and aspiring managers, we will
face a future where the business leaders of
tomorrow have poor leadership skills.
We need to focus on stakeholder
management and business change, not
just faster delivery.
When stakeholders are poorly aligned,
when they are ignored or when their needs
are not met we create the conditions
for doubt, for conflict, for rejection. We
must spend as much time as we dare
developing relationships with and between
stakeholders.
We must spend as much time as we
dare understanding their needs, their
concerns and their goals. We must spend
our time working to unite people in support
of projects so that they feel a part of the
change that is to come and support it.
It’s time we returned to the right path
towards project success. So I say death to
project management and long live project
leadership.
TESTING TIMES
With the increased cost and complexity of modern IT estates, effective testing plays an ever increasing role in a successful project launch. With this in mind, Frank Molloy and Mark Carter, IT experts at PA Consulting Group, explain how to add value during testing times.
The mindset and behaviours of project test teams can greatly influence a project outcome, and with some nurturing your test team can evolve from a necessary cost centre to an enabler of high-quality service. Without a strong sense of direction, however, or the opportunity to demonstrate the value they are capable of providing, test teams will struggle to change perceptions. A negative perception of test team performance can lead to slower delivery, increased cost or delivering the bare minimum. This kind of reputation will continue to drag down a test team’s productivity and perceived value long after a project ends. But this need not be the case.
Supporting your test team to shift their focus from solely finding defects to predicting quality in your projects will make a real difference to how the team is valued. Making that change isn’t about a new technique or methodology, but about developing a mature, experienced team that is trusted to positively influence the delivery of a project.
From our work across a range of clients, we’ve identified five key mindset and behavioural changes to help your test team become a key enabler of project success.
1. Focus on why a change is being made to keep end users at the forefront of everyone’s mind when designing tests. Adopting a user’s perspective creates fewer, more valuable tests that verify a larger business benefit, rather than endless technical or functional scripting. These low-quality tests often arise when scripting is based on the ‘what’ and the ‘how’ of the changes being implemented. For example, ‘can a customer search for, and find, products that are currently on sale?’ is a simpler, clearer and more meaningful test case than one that searches for every discount range or every product. Most of all, it enables your testers to implicitly look at the human elements of business and user engagement that are difficult to script for and shouldn’t be left until a dedicated usability phase.
2. Inform business and change owners in terms that are real to them to allow for greater accuracy in predicting meaningful quality. Metrics that tell stakeholders how a change will impact them will give them the confidence to make an educated decision about go-live. For example, telling a product owner that nine out of 12 e-commerce tests have been completed with one open defect is far less valuable than being able to tell them that ‘customers are able to search for products, add them to their baskets and complete their purchase with all major credit card types – except where the customer has added a discount voucher code’.
3. Embed the test team throughout the project lifecycle to help everyone in the team to continuously improve delivery and quality for each other and at all stages of the project lifecycle. The test team will then be able to support every member of the project, helping them to create a product that meets the needs of those it is being produced for, with each of the deliverable items ultimately working towards the business and operational benefits of the project as a whole. There is a positive multiplier effect when testing helps others in the project. For instance, a tester partnering with and coaching your business analysts during the design phase can prevent them from misrepresenting a requirement that is then further misinterpreted by your developers down the line. Before you know it, you could be reworking months of effort to solve something a trained tester would easily spot, and at a significantly lower cost.
4. Tailor and adapt test validation and verification techniques to ensure a light-touch, but thorough, approach. This is a critical balance that test teams need to manage effectively to avoid being seen as an expensive or bureaucratic mechanism that gets in the way. It is the test team’s responsibility to understand their stakeholders’ appetite for risk and to use an appropriate amount of effort to deliver a level of quality that meets the agreed needs of the business. To do this, test teams need to know what their remit is, what good looks like for the business, and feel able to speak up if things aren’t right or create an unacceptable risk. Combined with communicating in ways that are meaningful, test teams can accurately describe the level of risk faced by varying approaches and reach a consensus on approach and risk amongst all levels of stakeholders.
5. Empower your test team to act as the business’ quality assurance partner by being engaged, honest, open and upfront. This is probably the most meaningful thing a test team can do for a project as it makes the most effective use of your test team’s abilities and frees other groups within a project to focus on theirs. Sometimes this will mean they will have to say no, recommend that you abandon a work stream and go back to the drawing board. Listen, understand why they’re making that recommendation and don’t dismiss the rationale because you think they are being pessimistic or that it will cost too much – they’re probably right and are just trying to limit any more exposure in the long term. Heeding their advice and stopping doesn’t mean failure, but not allowing your testers to fulfil their potential and to deliver value does. Ultimately, not using your resources to their full potential is how projects fail.
When considering the right approach to testing, there is a large variance in the mindset and capabilities that test teams can demonstrate. Adopting the ‘how can I break this code?’ mentality adds limited business value, whereas actively seeking to introduce quality at all stages of project delivery is much more beneficial. Test teams who demonstrate the five qualities outlined above are the equivalent of artisans and scientists, but they can’t reach that stage of maturity if not supported by senior management. Educating your test team on the business benefits you’re looking to achieve will allow them to communicate meaningfully and support the business in a way that allows them to reach their full potential and deliver value in your eyes and their own.
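The contrast in point 1 between user-focused and purely technical scripting can be sketched as a single behaviour-level test. The shop catalogue and functions here are invented placeholders for illustration, not a real system:

```python
# Hypothetical catalogue standing in for the system under test.
CATALOGUE = [
    {"name": "Blue jumper", "on_sale": True},
    {"name": "Red scarf", "on_sale": False},
]

def search_sale_products(catalogue):
    """The capability the customer actually cares about: finding sale items."""
    return [p["name"] for p in catalogue if p["on_sale"]]

def test_customer_can_find_products_on_sale():
    # One meaningful test of the business benefit, rather than a script
    # per discount range or per product.
    results = search_sale_products(CATALOGUE)
    assert "Blue jumper" in results
    assert "Red scarf" not in results
```

One test phrased in the customer’s terms stands in for dozens of low-level scripts, which is exactly the shift from scripting the ‘what’ and ‘how’ to testing the ‘why’.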
TWO SPEED IT
Steve Burrows FBCS, Managing Director, SBA, compares and contrasts two popular project management
methods: Waterfall and Agile.
It is not only IT that has changed
dramatically over the past 50 years;
business has changed too - and a lot of
that change has been driven by us in IT.
Our technologies have opened the door
to globalisation and accelerated changes in
business process and practice, massively
shortened product life cycles as globalisation
has introduced new foreign entrants
into our home markets and enabled us
to outsource labour-intensive work into
cheaper economies. All this has changed the
expectations and demands of the customers
for our development products.
Modern business now typically develops
new products in months instead of years,
and commonly changes services and the
business processes that support them in
weeks instead of months.
The rate of change in business has
undergone several step-changes over
the past few decades, and in parallel
business adoption of and dependence
on our IT systems has deepened - IT is
now commonly business-critical 24 x
365 and it has become difficult to make
changes to business without implementing
corresponding changes to the IT systems
that support it.
In the face of this increased dependence
on IT, the growth in business influence and
responsibility held by IT functions and the
weight of expectations placed upon them
by their business counterparts, many IT
leaders have increased their use of formal
development management methods to
help them assure reliability of delivery to
the business - and they have fallen into
two camps, Waterfall and Agile.
Waterfall has been with us a long time.
The concept was described publicly over
50 years ago, and the Waterfall name
attached to the sequential development
model in the mid-70s. It is a proven model
for the development of new software
and systems, designed to ensure that
the product of development is produced
efficiently and matches the customer’s
requirements.
Many of us in the IT profession rely on
proper execution of the Waterfall method
to help ensure that our projects deliver
on our promises, captured in the initial
stages of the project as requirements. The
requirements specification is not only the
key reference tool for the development
team, it also forms the primary contractual
document between supplier and customer,
freezing at a point in time the customer’s
expression of their needs as a point of
reference against which the eventual
product may be measured.
As a method it can be argued that
Waterfall has not served us in IT well, nor
has it delighted our customers. Waterfall
projects tend to seem lengthy, and it is
common that during the development
lifetime the customer’s requirements
evolve, meaning that the end product fails
to meet customer’s evolved needs. Despite
the project successfully producing exactly
what the customer asked for, the product
disappoints and, consequently, our IT
function disappoints.
Many IT functions and developers
have reacted to this disappointment by
abandoning the Waterfall method in favour
of agile methods. ‘Agile’ has evolved over
time and, whilst it was first given the
name Agile in 2001, it had been around
for many years prior. I personally have
participated in projects run under what
would now be called agile methods since
the mid-80s. If you think about it that’s not
very long after Waterfall was codified.
Agile is primarily characterised by
its flexible approach to requirements.
Requirements are captured at the
beginning of a project, as in Waterfall,
but typically in much less detail, and are
clarified as the project progresses.
Agile projects typically progress as a
series of short iterative developments
instead of a long haul, with a pause for
breath between each iteration during
which the customer is shown the product
so far and asked to prioritise and clarify
requirements for the next iteration.
Agile is essentially development bit by
bit, assembling a jigsaw as the project
progresses instead of the Waterfall approach
of demanding a complete picture upfront.
Agile is probably no more of a recipe
for project success than Waterfall, but its
failings are different.
Properly executed an Agile development
will delight the customer because they
will get what they want, even though their
needs have changed during the life of the
project. It may, in theory, have cost more
and taken longer than originally envisaged
- two of the key measures of project
success - but customers generally forgive
these failings if the product is right.
Most commonly successful Agile
projects are shorter, less effort and
cheaper than their Waterfall counterparts
because the requirements specification
activity is much less detailed and onerous,
and also because the customer commonly
says the product is good enough when it
has fewer features than they originally
envisaged or would have specified in a
Waterfall development.
Set against these merits, Agile is more difficult to manage, commonly requires more highly skilled and flexible developers, and has fewer, weaker role definitions, which makes the task of improving our developers' skills more difficult.

Waterfall, a simple and reliable method, is still widely adopted despite its well-exposed weaknesses, because it is achievable and manageable. Agile is a harder discipline for the IT leader to deploy. The bigger the organisation, the greater the contrasts between the methods become - implementing Waterfall on a large scale seems structured, disciplined and managed, whereas large-scale Agile can easily descend into anarchy.

Consequently we have become a divided profession, with a predominant reliance on Waterfall in larger organisations and Agile contained to smaller or more fragmented organisations. This is becoming a problem.

Given the contrasting natures of Waterfall and Agile, and the increased dependence of business on IT, businesses that adopt Waterfall are commonly hampered and held back by the slow and stately progress of Waterfall development projects in comparison with competitors using Agile. The Agile focus on producing 'minimum viable product' as soon as possible, and worrying about the bells and whistles later, enables its adopters to deliver business change and new products and services sooner and at lower cost.

When we look at the performance of businesses in the market, the contrast has become starkly pronounced. Large businesses, giants of their industries that used to be leaders, have become followers, trying to hang on to previously established market share in the face of competition from an ever-increasing array of nimble upstart competitors that have seemingly appeared out of nowhere with superior offerings. When we look below the surface and consider what these upstarts have done, it is often clear that they have created new products and services in shorter timescales and with smaller budgets than the large players could ever imagine.
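The economics of 'minimum viable product' can be made concrete with a toy model (all effort and value figures below are invented for illustration): both approaches build the same feature list, but an Agile team ships the first increment as soon as it is built, while a Waterfall project ships nothing until everything is done.

```python
# Toy comparison of time-to-first-value (all figures invented).
# Each feature is (effort_in_weeks, relative_value); both approaches
# build the same list, most valuable first.
features = [(4, 50), (3, 20), (5, 15), (6, 10)]

# Waterfall: nothing is released until every feature is complete.
waterfall_first_value = sum(effort for effort, _ in features)  # 18 weeks

# Agile: the minimum viable product ships as soon as it is built,
# and the customer may stop paying for bells and whistles early.
agile_first_value = features[0][0]  # 4 weeks

print(waterfall_first_value, agile_first_value)
```

On these made-up numbers the Agile team puts value in front of customers 14 weeks earlier, and if the customer declares the product 'good enough' after two increments, the last 11 weeks of effort are never spent at all.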
The Waterfall vs. Agile method debate is
now being played out in the major industries
and stock markets around the world. The
division in our profession between Waterfall
and Agile has become material; it is no
longer a matter of management approach
and philosophy, it has become a key factor
in corporate survival, a choice between
being quick or dead.
Large corporate IT leaders the
world over are realising the need
to become Agile. CIOs and CTOs are
succeeding or failing based upon their
ability to introduce agility into their IT
functions, and many are learning that it is
far from easy.
Agile is not a single, simple coherent
recipe, it is a philosophy. For an IT function
to transition from Waterfall to Agile is
not merely a matter of process change;
it requires a complete change of culture
and, as any decent business guru will tell
you, culture is the single hardest thing to
change in a workforce. The Agile culture
deprecates tightly defined roles, erodes the
value and recognition of personal success,
reduces certainty, stresses teamwork and
continuous improvement, and demands
constant close communication with
customers - these are huge changes
for what has become an IT profession
characterised by tightly defined job roles.
But we don’t really have a choice. As
IT leaders many, perhaps most, have
recognised that, somehow, Agile is the way.
Our business colleagues now demand IT
responsiveness that can only be provided
through the adoption of Agile, and they in
turn have no choice because the markets
they serve place similar demands upon
them. As we approach the second half of
the decade our IT leadership agenda will
be dominated by this simple ultimatum: agility or redundancy, be quick or die.
The primary measure of an IT leader’s
success will, between now and the end
of the decade, not be how cheaply and
reliably they can make the IT function
deliver, but it will be how nimbly they can
make it respond to the ever-changing
needs of the business.
It is possible that the most influential
attribute of the successful IT leader will
not be anything to do with technology or
information, but their ability to change and
control the organisational and professional
culture of the IT function.
PROJECT MANAGEMENT
LEAN PROJECTS
The failure rate statistics for
IT-based projects paint a pretty
dire picture. The Standish Group
report that only 39 per cent of IT
projects deliver what they said
they would, on time and on
budget1. A BCS study reported
similar findings2. Gary Lloyd
MBCS suggests that we redefine
success in terms of value.
I have found that seasoned IT and
business professionals are even more
pessimistic about outcomes. During the
last year, I have been asking groups of
professionals:
‘What proportion of IT-based projects
deliver what they say they would, on time
and on budget?’
The answers are usually in the range of:
‘never’ and ‘less than 10 per cent’. Pushed
hard, I can get the audience up to 25 per
cent, but no higher.
There seems to be a weary acceptance
that this is just the way it is and that,
perhaps, the hurdle implied in the question
is a little too high anyway. After all, most
projects might be a ‘bit’ late or a ‘bit’ over
budget, but they still deliver most of the
value, don’t they?
Unfortunately not. In October 2012,
McKinsey reported that, working together
with the Saïd Business School at the
University of Oxford, they had accumulated
a database of over 5,000 projects and
concluded that:
‘On average, large IT projects run 45
per cent over budget and seven per cent
over time, while delivering 56 per cent less
value than predicted3.’
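Those averages compound. A back-of-envelope calculation (the project figures are invented; only the percentages come from the report) shows what they do to value for money:

```python
# What the McKinsey averages do to a hypothetical GBP 1m, 12-month
# project promising GBP 1.5m of value (project figures invented).
budget, months, predicted_value = 1_000_000, 12, 1_500_000

actual_cost = budget * 1.45                   # 45 per cent over budget
actual_months = months * 1.07                 # seven per cent over time
actual_value = predicted_value * (1 - 0.56)   # 56 per cent less value

promised_per_pound = predicted_value / budget  # GBP 1.50 of value per pound
actual_per_pound = actual_value / actual_cost  # roughly GBP 0.46 per pound

print(round(actual_per_pound / promised_per_pound, 2))
```

On these assumptions the project returns barely 30 per cent of the value per pound that its business case promised.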
Weary acceptance starts to look like
complacency, especially when coupled with
what psychologists call ‘optimism bias’.
We all believe that we are good at
anything that is important to us. It’s innate
and it’s the product of evolution not hubris.
So when it comes to projects, we are inclined to believe that we are smarter, more experienced and more determined than others, and that our projects will be better run and more successful. We assume that we can defy the statistics through force of will.

Redefining success
I believe that we need a mind-set shift, away from those false assumptions, away from the meaningless mantra of on-time-on-budget-on-specification, and towards a goal of maximising the delivery of usable business value... and the speed at which it is delivered.
That shift needs to accept that the only way we can reliably estimate any activity is to have done the same thing before, multiple times. IT-based projects are always novel: a new problem, a new team, new customers and, possibly, new technology. Accurate estimation just isn't possible.
We need to accept, not deny, the uncertainty inherent in projects and focus on what matters to customers: usable business value.

Lean thinking
It turns out that someone has been here before. At the end of the Second World
War, Japanese car manufacturers wanted
to compete on the global stage but lacked
the resources to do so. Led by Toyota,
they literally reinvented the manufacturing
process by making value king.
The story is told in the book The Machine
That Changed The World4. The MIT and
Cardiff academics who wrote that book
went on to write a further book, entitled
Lean Thinking5. In it they defined ‘five
principles of lean thinking’ that they argued
could be applied to any type of work:
1. Identify customers and specify value;
2. Identify and map the value stream;
3. Create flow by eliminating waste;
4. Respond to customer pull;
5. Pursue perfection.
Lean project management
The television programme ‘Grand Designs’
usually starts with a couple who want to
build their dream home, vowing to ‘be in by
Christmas’.
Two-thirds of the way through the
programme, we cut to a snow-covered
building site in December. The disconsolate
couple have run out of money and are
living, with their family, in a mobile home
adjacent to the site.
An alternative approach would have
been to ask the builder to deliver the
overall project in a series of chunks of
usable value.
For example, as many mobile home
owners will know, a fully-plumbed toilet
has high value. The couple could have
asked their builder to deliver a working
toilet, as a matter of priority, before
anything else.
This might be followed by a wash-basin.
Then a shower room. Then a kitchen. Then
a bedroom and so on. The total usable
value could have been delivered in chunks.
The builder might argue that this
approach would make the overall project
more expensive. But the cost and duration
are a guess anyway! Wouldn’t it be better
to exhaust the budget on a four-bedroomed
home with a roof, rather than an eight-bedroomed home without one?
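The builder's argument can be reduced to a few lines of code (chunk names and costs are invented): deliver chunks of usable value in priority order, and whenever the money runs out, everything already delivered still works.

```python
# Chunked delivery under an uncertain budget (names and costs invented).
chunks = [("working toilet", 900), ("wash-basin", 400),
          ("shower room", 2_500), ("kitchen", 8_000),
          ("bedroom 1", 6_000), ("bathroom 2", 3_000)]

def deliver(chunks, budget):
    """Deliver chunks in priority order until the budget is exhausted."""
    delivered = []
    for name, cost in chunks:
        if cost > budget:
            break               # the money has run out...
        budget -= cost
        delivered.append(name)  # ...but every delivered chunk is usable
    return delivered

print(deliver(chunks, 12_000))
```

With a budget that turns out to be 12,000, the household still gets a working toilet, wash-basin, shower room and kitchen - the equivalent of the four-bedroomed home with a roof, rather than the eight-bedroomed one without.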
1. Identify customers and specify value
The starting point for a lean approach is to think in terms of usable business value, not a long list of features and functions or ‘requirements’.
And to take a value-based approach, we need to understand how the different customers perceive value. Value is the ratio of benefit to cost, and benefit is in the eye of the beholder.
Contrast Sekonda and Rolex watches. If the benefit is the ability to tell the time, the Sekonda offers the highest value. If, however, the benefit is the ability to display wealth as a proxy for success, then the Rolex wins.

2. Identify and map the value stream
We could interpret ‘value stream’ as the development life cycle. This is well-defined for most IT-based projects, but I will not argue against the merit of critiquing the value-adding role of the various steps.
However, I find it much more useful to think of the ‘value stream’ as being a regular stream of usable value that combines to deliver a project vision. In the ‘Grand Designs’ example, the value stream might be:

• working toilet;
• wash-basin;
• shower room;
• kitchen;
• bedroom 1;
• bathroom 2;
• bedroom 3… and so on.

Note that the value stream needs to be understood before the design of the solution and built into that design. Ikea don’t design furniture and then figure out how to flat-pack it. Furniture is designed to be flat-packed.

3. Create flow by eliminating waste
According to The Standish Group, ‘only about 20 per cent of features and functions specified ever get used1’. If you are sceptical of such a low number, consider the McKinsey report that said that less than half of predicted value is delivered. A lot of that missing value would have cost money to specify. Either way you look at it, it’s a lot of waste. Considerable effort is being spent writing, reviewing, designing, testing and implementing requirements that are never used.
Toyota found that waste was created by production line bottlenecks. One stage of the production process produced more than the next stage could use. ‘Work-in-progress’ piled up, tying up cash. They found that eliminating bottlenecks created a smooth ‘just-in-time’ flow that saved considerable cost.
In the ‘Grand Designs’ example, there is no point having all of the materials paid for, delivered and on site months before they are used. And for an IT project, there is no point having a ‘pile’ of software that cannot be tested. Nor is there any point producing requirements that never get designed, or designs that never get built.

4. Respond to customer pull
Pull is tightly coupled to the principle of flow. Pull means only doing the work necessary to deliver a particular chunk of value and nothing else. The overall capacity of the development ‘pipe’ is configured backwards, from the organisation’s capacity to implement value-creating changes.
Note that capacity is not only governed by how fast stuff can be delivered. It is also governed by the need to check whether we got the value we expected and make adjustments or changes as necessary.
A side benefit of a ‘value-pull’ approach is that the final product is much more likely to create a satisfying customer experience than one based on ‘feature-push’. Think about the difference between the iPod and the feature-rich MP3 players that preceded it. The iPod was underwritten by the design maxim: ‘What do we want people to feel?’ That’s very value-pull.

5. Pursue perfection
Continuous improvement is difficult with projects because:

• the learning cycle has a long duration - similar situations don’t repeat quickly;
• each project tackles a different challenge;
• each project tends to assemble a different team;
• each project has different customers.

Taking a lean approach, however, is like running lots of mini-projects. Learning cycle times are shorter, it’s the same overall challenge, with the same project team, working with the same customer.
Each time value is delivered there is an opportunity to evaluate:

• value and quality versus expectation;
• cost and duration versus forecast;
• the relationship with the supplier.

If we continue to define success in terms of meeting a specification, on time and on budget, our projects will continue to fall victim to our optimism bias and complacency about value delivered. Instead, we need to redefine success in terms of delivering usable business value and adopt the tools of ‘lean’ to help make the leap that manufacturing took a very long time ago.
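The flow and pull principles can be caricatured in code (the stage names and the limit of two are invented): upstream work only starts when the work-in-progress limit allows, so unusable 'piles' of requirements, designs or untested software never build up.

```python
from collections import deque

# Caricature of a WIP-limited pull system (the limit of 2 is invented).
def run(items, wip_limit=2):
    """Process items without ever exceeding the work-in-progress limit."""
    in_progress, done, max_wip = deque(), [], 0
    for item in items:
        while len(in_progress) >= wip_limit:
            done.append(in_progress.popleft())  # downstream pulls first
        in_progress.append(item)                # only then start new work
        max_wip = max(max_wip, len(in_progress))
    done.extend(in_progress)                    # drain the pipeline
    return done, max_wip

finished, peak = run(["req-%d" % i for i in range(5)])
print(finished, peak)
```

However many requirements arrive, work-in-progress never exceeds the limit, which is the point: cash and effort are not tied up in half-finished output that nothing downstream can yet use.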
References
1. Chaos Manifesto 2013, The Standish Group: www.standishgroup.com
2. A study in project failure, BCS: www.bcs.org/content/ConWebDoc/19584 (viewed 6 Feb 2015)
3. Delivering large-scale IT projects on time, on budget, and on value, McKinsey Insights & Publications: www.mckinsey.com/insights/business_technology/delivering_large-scale_it_projects_on_time_on_budget_and_on_value (viewed 6 Feb 2015)
4. Womack, J., Jones, D. and Roos, D., The Machine That Changed the World. London: Simon & Schuster, 1990.
5. Womack, J. and Jones, D., Lean Thinking. London: Simon & Schuster, 1996.
WE ARE ALL PROJECT MANAGERS

Stephen A. Roberts and Bruce Laurie, both from the School of Computing and Technology at the University of West London, immerse themselves in the sometimes murky world of project management.

Project management has become a universal discipline. The fundamental knowledge of the principles and practices has become embodied in a mature subject of study.
It is widely taught and available to those who need a qualification to practice, and it is supported by professional qualifications such as PRINCE2 or those of the Project Management Institute. The irony is that, despite this trend, projects still, all too frequently, overrun in time and/or cost and often fail to live up to expectations.
Simply stated, project management is concerned with the fundamental question of how to manage effectively within constraints: at the primary level of time, financial and physical resources. These factors are in finite supply, and therefore careful planning, analysis, requirements specification, choice and decision-making is required to identify, design and execute courses of action. Project management has developed very effective techniques and tools of practical application in order to facilitate the achievement of goals and outcomes under constraint. But in order to be fully effective, project management has to be grounded in a strategic context that identifies what has to be done, who is to be involved and how it is to be done.
It is clear that managing a project that has specific goal(s) is quite different from day-to-day operations management. The increasing rate of change and global competition means that ‘steady state’ effectively no longer exists for many businesses. Thus managers are increasingly operating through a series of strategic thrusts, with each sub-project as part of an evolving probabilistic business strategy.
Every project begins with an essentially human goal, which, to be realised, has to go through a formal engineered process - a technical system. The outcomes of the process are then re-translated back into the real world of people, ideas and actions - a social and cultural system.
With relatively simple transactional processes the techniques of business and systems analysis are well developed. In the 20th century the business requirements were usually readily defined and relatively low in complexity, and the IT solutions were equally relatively simple.
The scale of projects is a major factor in determining outcome. Where requirements are fully understood, the level of innovation is limited, the environment and circumstances are known, and organisational and stakeholder settings are largely stable, successful delivery on outcomes, time and budget may follow.
However, this scenario is increasingly less likely, as scale and dynamic environments generate uncertainty and instability: hence the importance of initial appraisals (SWOT, PESTEL) and then more searching and ongoing critical strategic analysis. Investment of effort in appraisal, planning, modelling, risk analysis and sustainability is likely to encourage higher levels of return and greater operational resilience in the management of project processes.
The resolution of external conditions is the key to creating internal resilience. Data gathering, intelligence analysis, and information and knowledge management practices tailored to needs are vital. And perhaps most important are the two-way communication strategies, not just to manage flows but also to build communities of practice and a sense of polity (common consent and grounded understanding) around which the ‘soft’ processes can be managed and local ownership of implementation achieved. Project ownership, not only within the project team but at every level of the business that will be affected, is vital.
Prototyping and agile development approaches have enabled us to evolve an understanding of requirements and the resultant solutions, but the traditional tools and techniques have become less useful when the business requirements are complex and subject to change. The resultant system is often a hybrid human-computer system; e.g. I want an ICT system to help my doctor diagnose, not turn him into a robot.
We have been used to dealing with
project delivery risk; indeed Barry Boehm’s
1986 spiral model had risk management
at its core. We now have to deal with
concurrent business risk: the business
does not stand still while a long-term ICT
strategy is rolled out.
The socio-technical reality is brought
into ever sharper focus and there are ever-closer relationships between customers,
stakeholders, managements, project
clients and project teams and all those
who have an interest in infrastructure
and process. Behaviour, interaction,
communication, context, requirements and
general dynamics dominate both project
and business environments. Together
these create a ‘knowledge space’, which
becomes an essential part of the context
and content of project management.
Perhaps the best aspect of PRINCE2 is the
‘handshake’ between the business that will
deliver the benefits and the IT function that
will enable them.
In this era of concurrent business and
ICT risk, it is important that both ICT and
the business take a manageable step out
of the known risk areas together to explore
the uncertainties and gain mutual learning
and understanding. In this flux ‘we are all
project managers now’.
Traditional project management
has emphasised hard skills, tools and
techniques. These are still needed but
must be supplemented by organisational
development, social and cultural
awareness skills. The predominant driver
for ICT projects up to now has been
organisational efficiency: fewer clerks
or more sales for the same workforce.
It is now faced with projects to support
greater organisational effectiveness, but
in an environment of almost institutional
change. Not only must a much wider
view of risk be taken, but systems must
be developed more incrementally and
adjusted as the organisation responds
to global competition. Project managers
must mediate the business needs with the
ICT delivery and be far more aware of the
business and organisational dimensions.
For this to be a reality, the strategic
alignment of business strategy and an
enabling ICT architecture to encompass
strategic possibilities is vital and the
implications and limitations of that
architecture (the boundaries of the pitch we
are playing on) must be fully understood by
the business.
In this environment the business case
will be regularly revisited; stage reviews
will be as much about the current business
understanding as the ICT delivery, and
frequent implementations will be used as
vehicles for real mutual learning.
LINES OF DEFENCE
The ‘three lines of defence’ model ensures adherence to a framework, but can it assure successful project
delivery? Peter Mayer, Managing Partner at Pelicam Project Assurance, investigates how we can eliminate
‘defence killers’, whilst going beyond the model to attack root causes and achieve programme success.
The corporate world inevitably takes risks
to meet shareholder demands. Over recent
years, these risks have been widely promoted
and the ‘three lines of defence’ concept
was introduced by FERMA and ECIIA1 to
promote corporate transparency, enhance
communications, and ensure clear roles
and consistent boundaries of accountability.
Early adopters of the model have seen
improvements, including:
• cohesive management;
• stronger risk culture;
• better definition and communication of risk appetite;
• clearer ownership and drive towards risk resolution;
• objective reporting of risk into senior management.
But can we use the ‘three lines of defence’
as a risk framework for projects and
programmes?
Rules of Engagement: Applying the Three Lines of Defence to projects

Board of directors / governing body / audit committee
Senior management / risk committee

1st line - Operational management
Key activities: implement governance, risk and control frameworks; measure and manage project performance; manage risk (within agreed risk appetite).
Outcome: control risks.

2nd line - Risk management
Key activities: design governance, risk and control framework; monitor adherence to framework; provide timely, balanced information.
Outcome: confirmation of control.

3rd line - Internal audit
Key activities: review framework application objectivity; offer independent oversight of 1st and 2nd lines.
Outcome: strategic overview of controls.
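For readers who prefer the model explicit, the table can also be captured as plain data (an illustrative sketch with condensed wording, not an official schema), which makes it easy to check, for instance, that every line owns a distinct outcome:

```python
# The 'three lines of defence' table as plain data (condensed wording;
# an illustrative sketch, not an official schema).
LINES = {
    "1st line (operational management)": {
        "activities": ["implement governance, risk and control frameworks",
                       "measure and manage project performance",
                       "manage risk within agreed appetite"],
        "outcome": "control risks",
    },
    "2nd line (risk management)": {
        "activities": ["design governance, risk and control framework",
                       "monitor adherence to framework",
                       "provide timely, balanced information"],
        "outcome": "confirmation of control",
    },
    "3rd line (internal audit)": {
        "activities": ["review framework application objectivity",
                       "offer independent oversight of 1st and 2nd lines"],
        "outcome": "strategic overview of controls",
    },
}

# Each line owns a distinct outcome - accountability is never shared.
assert len({line["outcome"] for line in LINES.values()}) == len(LINES)
print(sorted(LINES))
```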
1st Line
Project teams work at the coalface (typically with project office support embedded) to a management delivery framework (policies, procedures, processes), with a focus on risks being identified, managed and reported.

2nd Line
Typically, intelligent project office, risk management office or governance teams oversee programmes or sections of the portfolio. Teams must design a fit-for-purpose framework to ensure that identified risks are consistently controlled and reported to senior management.

3rd Line
Internal audit must offer independent oversight across the organisation, whilst monitoring the adherence of first and second lines, to confirm that the control framework is in place and working.

However, we have also seen a number of weaknesses in its implementation…

Engage with the enemy: eliminate ‘defence killers’
‘Defence killers’, such as ‘passing accountability’ and ‘unwanted overhead’, loiter in the first, second and third lines. They must be engaged with and eradicated to allow your teams to lay down a strong foundation of risk culture.
To identify and prevent ‘defence killers’, watch out for the following:

Passing accountability: First line teams confuse their responsibilities with second and third lines and relinquish control.
Eliminate: Embed the Three Lines of Defence model in your delivery framework. Understand who needs to do what. Establish defined roles with clear accountability: communicate, syndicate and measure.
Unwanted overhead: First line teams view
second and third lines as imposing a burden
on them, with no added value.
Eliminate: Recognise the limitations of
your delivery framework. Monitor first line
adherence to the framework, but do it with
a ‘light touch’. Ensure key learnings are
proactively shared and understood, not lost.
Politics: Risks are not called out for fear of
upsetting people, with recriminations and
career limitations.
Eliminate: Focus on unambiguous, objective
reporting to senior management, including
progress against time, quality, scope, risk
and benefit targets.
Going native: Assurance resources are
‘embedded’ in a project team and lose their
independence.
Eliminate: Keep the teams separate.
Assurance will inevitably integrate
themselves within the project team when
working closely together. Ensure all
assurance maintains objectivity and
independence.
Bayoneting the wounded: Focusing on how things went wrong this time, rather than how things could be improved next time, creates unhelpful pain and provides little value to the project.
Eliminate: It’s easy to point out problems
after the event, but it’s not always useful.
Instead, demonstrate positive outcomes
from lessons learnt activity.
Identify the ‘landmines’: unearth risks
and attack root-causes
The project landscape is like a minefield.
Employing the ‘three lines of defence’ helps
assign ownership, accountability and
governance to risks that have been
identified. But what about the risks that
remain below the ground? They are a
series of ‘landmines’ waiting to explode.
Therefore, we must do more to gain
visibility and detect, dig up and defuse
these ‘landmines’, such as ‘no holistic view’
and ‘ignore root cause’, before it’s too late:
No holistic view: Project roles are assigned
too low level (work packages) and few
people have visibility of the bigger picture.
Defuse: Build team aspirations and capability.
Encourage a ‘winning mentality’. Empower
people to take action and make decisions.
Ensure experienced, key people maintain
visibility across the project and its
surroundings, ensuring it remains viable.
Ignore root cause: Teams focus on impact,
probability and mitigation. Root causes are
not considered.
Defuse: Establish independent assurance
as part of the DNA within the project culture.
Get under the skin of the project. This is
not a ‘tick and turn’ exercise. Move the
mind-set from ‘plan to deliver an audit’, to
‘deliver an outcome of real value’.
Process not principle: Teams concentrate
on compliance to a faulty framework and
don’t exercise judgement. They focus on
compliance and regulatory standards,
ignoring do-ability.
Defuse: Work with the delivery
framework, but recognise it’s not the
answer. Encourage its development (and
simplification). Develop a risk-based
approach, using principles as well as
procedures.
Losing independence/challenge: Inexperience means we accept the project team’s word and are unable to challenge.
Defuse: Build relationships founded on
credibility. Ensure reviews are independent,
fact-based and collaborative, with clear
sights on do-ability. The goal is to fix the
project, not report it as broken. Assurance
should be engaged early on in complex
projects, calling-out risks before they hit.
Winning the battle: steps to project success
The ‘three lines of defence’ model merits
our attention; however, simply adhering to a
delivery framework will not buy you project
victory and participants at all levels are in
danger of being seen to fail their duties.
The model alone cannot uproot all risks,
nor protect you from the weaknesses
that surface during implementation,
manifesting themselves as ‘defence
killers’.
This often generates blame, denial and excuses, meaning we must reinforce and optimise the model to avoid project failure:

Enhanced key activities

1st line
1. Use the delivery framework to establish cohesive roles and responsibilities.
2. Build team aspirations and capability; develop a ‘winning mentality’.
3. Manage risk, within an agreed risk appetite.
4. Maintain a holistic view and ensure continued viability.
New outcome: control risks.

2nd line
1. Monitor adherence to the delivery framework and continue to develop and simplify.
2. Report progress against key measurement targets.
3. Ensure key learnings are learnt (not lost) and are pro-actively shared.
4. Establish independent assurance as part of the DNA within the project culture.
New outcome: confirmation of control.

3rd line
1. Give independent oversight of 1st and 2nd lines’ effectiveness.
2. Offer independent, objective assessment. Provide real insight into the do-ability of the project.
3. Prioritise 3rd line activity independently, with early engagement on complex projects.
4. Ensure reviews are collaborative and develop credible, trusted relationships.
New outcome: strategic overview of controls.

Follow this optimised model collectively and create an action-plan that is owned, visible to senior management and acted upon.
This will allow you to take a proactive route of attack. It will significantly improve your prospects of eliminating ‘defence killers’ - ensuring project and programme risk governance across the organisation - as well as unearthing hidden risks, or ‘landmines’.
Do this, and you will be rewarded, as project and programme delivery rates will rise.
References
1. Federation of European Risk Management Associations (FERMA) & European Confederation of Institutes of Internal Auditing (ECIIA).

Based on Pelicam research and discussion at Pelicam’s ‘Intelligent Projects Forum’ (December 2014). For more information, please visit www.pelicam.com
leadership models:
1. hierarchical and functional, where
demands are executed without being
questioned;
2. democratic and explicative, where
demands stem from discussions and
considering all opinions.
POLITICS KILLS THE
PROJECT STAR
Sixty per cent of projects fail due to poor communication, scope creep, lacking risk control, cost
miscalculations, loss of strategic interest or sabotage. With this in mind Dr Stefan Bertschi and Gandin
Hornemann, Consultants at Pcubed Program Planning Professionals, discuss how stakeholder sabotage
can be identified and lead to recovery.
The two types of sabotage are:
1. Passive sabotage as unexplained,
hidden expression;
2. Active sabotage as explicable, open
objection.
The former seems to have political and
tactical connotation; it leaves the project
powerless and in lack of response, unable
to apply standard methodology to recover.
The latter can be addressed and managed
from within the project.
Passive sabotage is bad because it
subtly undermines endorsed project
goals; it robs effectiveness as it cannot
be managed directly, and is a ‘time thief’,
leading to significant project delays. The
stakeholder is apparently involved, but
without real interest in project progress
and with commitment as lip service.
Active sabotage is defined as having a
hidden agenda with noticeable explanation,
then again passive sabotage as having an
unwitting agenda. Differentiating it from
90
DIGITAL LEADERS 2015
passive resistance, stakeholders are
seemingly ‘unknowing’ of their sabotage
and damage caused (or simply do not
care).
In a typical scenario, an IT delivery
project has its technical foundation
defined and agreed. A complex solution
comprising of multiple systems is
endorsed by senior management. A few
months into the project, an alternative
solution is suddenly proposed by the IT
supplier. This is presented as needed
systems comparison, not as scope change
request, and since it does not trigger
control, it delays the project.
No explanation for this late and
sudden demand is ever provided by the
stakeholder, and middle management, not
wanting conflict on the project, can only
recognise its changing status. The project
cannot turn to positive resolution and is
paralysed. Discussion is not possible as
the whole project is overshadowed by
politics that cannot be addressed.
The problem encountered here we call 'passive sabotage': a major stakeholder not working towards agreed goals. In this scenario, traditional project governance and its escalation process have proven unproductive.

Project assassination
Reasons for project failure can be tied to a more fundamental cause, which is the project environment, especially where a weak project culture exists within the organisation. Historically grown structures and emergent processes undermine progress and openness, working against a culture that supports the success of projects through trial and error.

The authority of a project, and its success, depend largely on the existing culture. The leadership model, as the 'heart of politics', determines the project culture. Our experience in executing challenging IT delivery and business change projects shows its crucial importance.

There are predominantly two main models. The predominant model in an organisation defines how projects are run and how delegation takes place from strategic senior level down to delivering team level:
1. In a hierarchical and functional model, the culture involves strict management, with team members instructed to accept and follow orders passively.
2. In a democratic and explicative model, the culture revolves around loose management, with team members invited to express opinions and object proactively.

If any part of the project does not work towards agreed goals (even where unwilling delivery had to be ensured), there is a foundation for sabotage. If demands are managed wrongly and leadership does not support an open project culture, there is a higher probability of passive sabotage. The stakeholder's 'unknowing' is its defining feature.

Response by activating
The hard news first: passive sabotage cannot be prevented, and even a working escalation process may help or have the opposite effect altogether. Being passive, it will show effects but cannot easily be detected. Since traditional project methodology is insufficient, it is difficult to find a way forward.

Recognising passive sabotage early is critical for recovery and requires monitoring, with alerts on deviations from project commitments (be it an altered course of action, a changed sense of project responsibility or acceptance, or a failure to work towards endorsed goals).

Our response to passive sabotage is a step approach:
A. 'Activate' the passive sabotage.
B. If the situation is a complete mess, consider (external) mediation.
C. Replace stakeholders.
D. Remove stakeholders without replacement, if possible.
E. As the ultimate wake-up call, threaten to close the project.

Activating passive sabotage is about (re-)educating stakeholders by unlocking the communication whose absence prevented assessing the reasons for their sabotage. Activation is not a risk response, it is a change response; it does not address scope or any other controllable change, it addresses behaviour. Negative conflict and mistrust have to be turned into positive conflict and feedback.

To facilitate this, the right authority has to be given to the project. The resolution of how to activate passive sabotage has to be sought by the project manager; this requires a leader in control of seemingly uncontrollable situations. It is a thankless job to seek a cleansing discussion with those sabotaging and then to work through the step approach above. Most promising is the attempt to show the consequences of the 'how' (direction) against the 'what' (goal), of competencies against strategic vision.

If passive sabotage can be activated, then it can be managed, and the response may lie in diplomacy, de-personalising and de-politicising. If it cannot be activated, the killing is about to start.
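The early-warning monitoring described above can be sketched as a simple commitment tracker that raises an alert whenever an endorsed commitment is no longer being observed. The commitments and the stakeholder name below are hypothetical illustrations, not part of any standard project method:

```python
# Illustrative sketch: track each stakeholder's endorsed project commitments
# and raise an alert on any deviation. Commitment wording is hypothetical.

ENDORSED = {"agreed course of action", "accepts responsibility",
            "works towards endorsed goals"}

def deviation_alerts(stakeholder: str, observed: set[str]) -> list[str]:
    """Return one alert per endorsed commitment no longer being observed."""
    return [f"ALERT: {stakeholder} deviates from '{c}'"
            for c in sorted(ENDORSED - observed)]

# The supplier still follows the agreed course, but two commitments have lapsed:
alerts = deviation_alerts("IT supplier", {"agreed course of action"})
```

Such a tracker only surfaces the symptoms; the response steps above still have to be worked through by the project manager.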
To change or to be killed
The secret of tackling passive sabotage lies in engaging stricter leadership where necessary, if only temporarily. The aim is to grow the right culture; the goal, to kill the foundation for passive sabotage before the project itself is assassinated. However, growing a lasting mentality and culture is not usually part of standard project management methodology.
From change management we learn that the stakeholder is at the heart of behavioural change. Understanding the stakeholder's role as part of a strengthened project delivery is the first step.
Transformation (at project team level)
requires transition (at management level),
and this altered state must be recognised.
In any case, addressing individual
behaviour is just a quick fix, allowing us
to embrace change in one situation only.
Notably in IT delivery projects, aspects of
organisation, culture and people need to
be considered comprehensively to ensure
the group is proceeding towards business
goals and success.
For a sustainable response, change
should ideally lead to an open but strong
project culture. Having such a culture in
place minimises the appearance of passive
sabotage. This requires change management
to become part of project management.
In light of managing tensions around
politics, the project manager must find the
root cause, and actively manage change
and engagement. Winning hearts and minds
is much more than an esoteric phrase.
Only the combined package, scaled to
the appropriate level, tackles stakeholder
sabotage and leads to recovery.
As a last resort, the 'star' to be killed is not a person. Since people lead and deliver projects, we have instead discussed the importance of managing behavioural change whilst other change is taking place, and of getting the right project culture in place before things have gone too far.
SECURITY
A NEW HOPE
With a Fortune 500 CEO ousted and a Hollywood movie held hostage, cyber-security is very much on the
minds of chief executives in 2015. With this in mind Gaurav Banga, co-founder and CEO of Bromium, asks:
How can a massive organisation with complex systems and networks prevent itself from becoming the
next Target or Sony? Is there any hope?
Yes, there is hope! However, we have to change the economics of cyber attacks. In The Art of War, Sun Tzu puts the economic considerations of war front and centre. The business of cyber-security is also an economic game.
Cyber-crime is red-hot because it makes
great economic sense to the adversary.
The investment of time and money
required for cyber criminals to breach a
billion dollar organisation is infinitesimally
small compared to the payoff. A team of
two or three hackers working together for
a few weeks with a few thousand dollars of
black market software is often enough to
breach a Fortune 500.
This reality confounds CISOs who
already spend tens of millions of dollars
every year on IT security. Your IT security
investments are not giving you any
leverage!
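The asymmetry described above can be made concrete with a rough, back-of-the-envelope calculation. All the figures below are illustrative assumptions, not data from this article:

```python
# Illustrative attacker-economics sketch: compare the adversary's investment
# with the expected payoff of breaching a large organisation.
# Every figure here is a hypothetical assumption, for illustration only.

def attacker_roi(team_cost_per_week: float, weeks: int,
                 tooling_cost: float, expected_payoff: float) -> float:
    """Return the adversary's return on investment (payoff / total cost)."""
    investment = team_cost_per_week * weeks + tooling_cost
    return expected_payoff / investment

# A small team for a few weeks, plus black-market tooling, versus the payoff
# from a breached billion-dollar organisation:
investment_roi = attacker_roi(team_cost_per_week=6_000, weeks=4,
                              tooling_cost=5_000, expected_payoff=10_000_000)
print(f"Attacker ROI: {investment_roi:.0f}x")  # hundreds of times the outlay
```

Whatever figures you plug in, the ratio stays lopsided; that is the leverage problem the article describes.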
Antiquated defences - vast attack surfaces
Current security architectures were
designed in a bygone era when there was
a useful notion of an internal network
inside the corporations’ buildings, and the
internet outside. The firewall was invented
to create a narrow isolating choke point
between internal networks and the
internet allowing only a few controlled
interactions. All was well!
In today’s world of mobile, social and
cloud, the situation is quite different. Your
systems routinely run computer programs
written by persons unknown. While you
may not realise it, each internet web page
is a computer program, as is every email
attachment, and even web advertisements.
Just about any internet-connected
‘rectangle’ that you see on an electronic
screen is a program. All these external
programs are potentially malicious, and
can compromise you.
A single bug in the over eighty million lines of computer software in Windows or Mac OS, or in any app (e.g. Office, Java, Adobe), combined with an inevitable mis-click by an unsuspecting employee, can compromise your enterprise. You have a massive attack surface: literally countless places for the bad guys to get in!
The endpoint is your unguarded front door,
where you are being attacked continuously
as your employees click away in offices,
homes, coffee shops and hotel rooms.
The endpoint is the weakest economic
link in your defences. Once an endpoint is
compromised, the adversary can remotely
control the infected computer with the
same privileges on your network as one of
your legitimate users.
Backfire from security investments
Let’s consider the economics of the next-generation firewall. First, the next-gen firewall does absolutely nothing for your riskiest mobile users.
Moreover, modern malware tries hard
to avoid misbehaving while it is still within
your network pipes before reaching
an endpoint. The firewall, grasping at
straws, generates a large daily stream of
seemingly suspicious events.
These notifications have to be
analysed and chased down by additional
investments in event management
systems, and security analysts. The
overwhelming majority of these events
turn out to be false positives, i.e., wasted
money.
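The base-rate arithmetic behind this waste is worth seeing once. In the sketch below the event volumes and rates are hypothetical, but the conclusion holds for any realistic numbers: when real attacks are rare, even a good detector produces mostly false positives:

```python
# Why most firewall alerts are false positives: a base-rate sketch.
# All numbers are hypothetical, for illustration only.

def alert_precision(events_per_day: int, real_attacks: int,
                    detection_rate: float, false_positive_rate: float) -> float:
    """Fraction of generated alerts that correspond to real attacks."""
    true_alerts = real_attacks * detection_rate
    false_alerts = (events_per_day - real_attacks) * false_positive_rate
    return true_alerts / (true_alerts + false_alerts)

# One million events a day, ten real attacks, 99% detection, 1% false positives:
p = alert_precision(1_000_000, 10, 0.99, 0.01)
print(f"{p:.4%} of alerts are real")  # well under one per cent
```

Each of those thousands of false alerts still has to be chased down by an analyst, which is exactly the reverse leverage the article goes on to describe.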
The bad guys also use this as a weapon,
by cranking up the volume of spurious
traffic known to generate false positives,
while the real attack is carried out
elsewhere. This is reverse leverage.
Ultimately, the next-gen firewall
becomes a bottleneck, a choke point,
unable to keep up with your growing
traffic. You have to spend more money on
additional hardware that generates even more false-positive events; a vicious cycle.

A new hope
There is hope. Innovation will resolve this crisis. You cannot afford to keep doing more of what you have done in the past, or more incremental versions of the same. You have to look beyond Security 1.0. To level the playing field, organisations must invest in a strategy that directly impacts the economic costs to malicious actors.

You are looking for products that reduce your exposure. Your investments must protect your information from unknown internet programs that run on your endpoints, while still supporting such programs seamlessly. This isolation technology must be simple and robust, like disposable gloves in a hospital. It must be designed such that it costs the adversary significant time and money to try to break through. Ideally, you must also be able to fool the adversary into thinking that they have succeeded, while gathering intelligence about the nature of the attack. Techniques like micro-virtualisation let you do this.

You will also need new products that let you continuously visualise and monitor your risk at the internet endpoint level, and provide end-to-end encryption and robust identity authentication. Your compliance, device management and insider-threat monitoring systems must also work within this framework.
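The 'disposable gloves' idea described above can be sketched conceptually: each untrusted task runs in its own throwaway environment, which is discarded afterwards, keeping only the intelligence gathered. This is a lifecycle model only, not a real isolation boundary, and the class and field names are our own illustration:

```python
# Conceptual sketch of disposable isolation: every untrusted task runs in its
# own throwaway container, which is discarded afterwards. This models the
# lifecycle only; it is NOT a real security boundary.

class MicroVM:
    def __init__(self, task: str):
        self.task = task
        self.tampered = False          # set if the task misbehaves
        self.telemetry: list[str] = []

    def run(self, malicious: bool = False) -> None:
        if malicious:
            # The attack 'succeeds' only inside the disposable VM, and we
            # record intelligence about it instead of being compromised.
            self.tampered = True
            self.telemetry.append(f"attack observed while running {self.task!r}")

    def discard(self) -> list[str]:
        """Throw the VM away; only the gathered telemetry survives."""
        return self.telemetry

vm = MicroVM("open email attachment")
vm.run(malicious=True)
intel = vm.discard()   # host state is untouched; we keep the intelligence
```

The economics follow from the lifecycle: the adversary's effort is spent compromising something that is thrown away moments later.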
Plan ahead or fall behind
A very senior executive, i.e., you, Mr. CEO,
is going to have to micro-manage the plan
to mitigate the risk of cyber-attacks. This
is a time of great risk to our organisations,
so leaders must follow their own business
instincts.
How will you figure out the products
that will make up your new security
architecture? This is quite straightforward,
just ask Marc Andreessen, the venture
capitalist, or Phil Venables of Goldman
Sachs for a list of 5-10 start-up companies
with innovative Security 2.0 products.
Ignore any company that is not run by
its founders. You must partner with people who have long-term goals towards your economic victory against the cyber-adversary, and who are thinking beyond just a quick transaction.
Ask the start-up leaders to come and
pitch their solutions to you personally.
Have them convince you of the efficacy
of their approach. If you don’t understand
what is being said, or if you don’t see
how the proposed solution raises the
economic costs to the adversary by orders
of magnitude, it is not worth your while.
Select what you truly believe in, and then
help the start-ups help you!
Unless you have one already, hire a top-notch CISO as a partner for this project.
For suggestions on whom to hire, ask any
one of Jim Routh (Aetna), Tim Dawson (JP
Morgan Chase), Roland Cloutier (ADP), John
Zepper (US Department of Energy), Tim
McKnight (GE), Sunil Seshadri (VISA), Mark
Morrison (State Street) or Bob Bigman
(former CISO of the CIA).
These are some of the modern-day Knights of the Round Table in the realm of cyber-security, and they understand the economic principles underlying this fight.
While you transform your security
infrastructure to turn the economic odds
back against the adversary, your company
might look like an ‘under construction’
zone. Some users will complain loudly,
and you will have to make an effort to have
the business running smoothly while the
transformation is in play.
Nothing worth doing is ever easy, and
you must be prepared to see this through.
The risk of inaction is worse.
SECURITY
ANTI-SOCIAL MEDIA
Reza Alavi MBCS, an information security consultant, reveals ten key methods of helping organisations to combat social engineering attacks (SEAs).

Protecting and preserving information assets has expanded from merely focusing on technological alignment into a crusade of understanding and defining human factors and roles. Social engineering attacks (SEAs) are among the most pervasive security threats to organisations and are ever increasing, both in quantity and quality.

SEAs such as advanced persistent attacks (APAs), reverse social engineering (RSE) and phishing all exploit the greatest and very real weakness of the human elements in information security systems. Despite a growing awareness of information security threats, SEA incidents and breaches are on a steady rise, according to the Data Breach Investigations Report (DBIR) by Verizon Enterprise (2014)1. These incidents have led to billions of pounds in losses, legal fines and, most importantly, reputational damage. The most striking question is: why, despite the many technological automated security countermeasures that are compulsory for people who work in organisations, do security incidents still keep happening, and why can solutions not be found?

SEAs often concentrate on an individual's perceived needs, such as greed or the need to be acknowledged and understood. Attackers attempt to establish personal relationships and trust with their targets, and use legitimate and authenticated communication channels. And today the majority of attackers are professional criminals who easily penetrate security systems by using individuals' weaknesses against them. It is therefore extremely difficult for organisations to deal with and counter the threats of SEAs. There is no way SEAs can be detected and confronted with 100 per cent reliability. But there are certain steps that can help organisations to create an acceptable level of defence against SEAs. CISOs should take a leading role in dealing with such attacks.

Whilst the list that follows is not comprehensive, it presents ten key things that organisations must do to increase their defensive performance against SEAs.

1. Acknowledge the individual merit and service provided by each and every member of the organisation. Staff should feel that they are valued employees and that their service is treasured. Most organisations forget to acknowledge the service their employees provide. Organisations survive because they have these individuals working for them.
2. Establish a two-way trust relationship with your employees. This can be achieved with various training sessions. Employees need to know and understand the risks they face themselves. They must be told about the organisational goals and objectives as well as the major security threats and potential losses that exist if the system is breached. They need to understand that they are a major part of that system.
3. Think like a social engineer and train your employees in the same way. Train them to question things, and to think in a security-conscious way. They should learn how to approach other individuals with an aim to obtain pieces of information in the same way that a social engineer would.
4. Re-structure the focus of your organisation's security culture to prioritise SEAs as the main threat to information assets. Security culture can be re-shaped and defined as long as information security objectives have been defined and set out clearly.
5. SEAs should be on top executives' agendas, as all evidence supports the substantive and extensive financial losses that can be incurred through SEAs. Management support and placing information security within the life cycle of wider risk management are essential to the continuity of business. It is important that the whole of the executive management team are included in security awareness programmes and that their skills and knowledge of SEAs are updated.
6. Enhance communication skills and understanding of the importance of adequate and effective communication channels. Communication can be perceived differently by different people and, therefore, it is essential that employees have an understanding of how communication can be manipulated by social engineers.
7. Confront the elicitation techniques used by attackers to deceive individuals. It is important that this aspect of SEAs is added to awareness training programmes in order to further educate employees.
8. Provide a consistent approach to SEAs and let employees exercise their consistent commitments towards security objectives.
9. Provide both financial and non-financial incentives to motivate your employees to adopt a consistent and effective approach to information security issues. The same incentives are in the minds of attackers when they plan their attacks.
10. Convert social engineers' tactics into useful and positive tools for confronting future attacks. Use them to update training materials and awareness levels.
SEAs are a clear and present danger to the
very existence of information assets. No
firm will be able to fully confront this issue
without the full engagement of employees
and without instilling in them the awareness
and trust they deserve to fight against such
attacks.
References
1. Verizon Enterprise Solutions, 2014. 2014 Data Breach Investigations Report (DBIR). Available at: www.verizonenterprise.com/DBIR/2014
SECURITY
CHANGING SPOTS
Elke Bachler, Deputy Director of Security & Information at HM Revenue & Customs, talks about the
changing face of government security.
Government security has a fearsome
reputation by tradition – just think of the
Manual of Protective Security, ISO27001,
IS1/2, onerous and labyrinthine assurance
requirements and the confusing shadowy
presence of CESG and GCHQ in the
background.
All of this made it feel like an episode
from ‘Yes Minister’ crossed with ‘The
Da Vinci Code’. Not surprisingly, many
companies decided that it was just too
hard to work with government. SMEs,
in particular, found it almost impossible
to win government work, unless they
were able to subcontract through large
suppliers who could afford the overhead
of engaging with security. But although it
might not always quite feel like it, things
are definitely changing.
Digital agenda
One of the key drivers for change is the
call to deliver services that meet customer
needs. Digital service delivery promises
better and more responsive services,
much greater speed and efficiency, and
lower cost. The challenge is how to enable
innovation and quick delivery securely.
96
DIGITAL LEADERS 2015
The proliferation of digital services both
increases the attack surface and the size
of the ‘prize’ an attacker may hope to gain
if successful. Networks are becoming
increasingly complex, with multiple access
points, management platforms and attack
vectors, making it more difficult to monitor
and manage security.
High-profile incidents have brought
the security issue to the attention of
the general public and led to a wide
debate on balancing the need to protect
with the need to enable. Building and
maintaining citizens’ trust in the security
of government digital services is a
critical factor in being able to exploit the
opportunities on offer.
Government Protective Marking Scheme
The most prominent change must
surely be the demise of the Government
Protective Marking Scheme. Earlier this
year, the old scheme with its six
classification levels was replaced by the
much simpler and clearer Government
Security Classifications with only three
classifications – OFFICIAL, SECRET and
TOP SECRET; and the vast majority of
information will be at OFFICIAL level1.
The need to protect information has not
changed, but an overly complex and highly
directive system developed for paper
processes has been replaced by a simpler
and easier to understand model, more
appropriate for modern working practices.
Moving to the new scheme has not been
without its challenges. There is no direct
mapping between the old and the new
model; the handy shortcut of ‘IL3 means
RESTRICTED, RESTRICTED means IL3’ no
longer works; and processes and systems
take longer to change than policy.
There is also a culture change that still needs to happen - seasoned security professionals have been overheard muttering '…there will always be a CONFIDENTIAL…' - and risk aversion leads many to over-mark their information 'just in case'. But it is a critical step that sends an important message, namely that security policy's role is to provide guidance and direction, so that staff and suppliers can make sensible and appropriate decisions based on informed judgement.
It also sends another important
message to industry and potential
suppliers, namely that government’s doors
are open for business. Security, too often
perceived as a ‘keep out’ sign, has resulted
in stopping business from happening
rather than making it happen securely.
In time, new classifications will make
it simpler for government departments
to procure from a variety of suppliers
and enable suppliers of all sizes to bid for
government work. All of this will lead to more
innovative solutions, different and more
flexible commercial partnerships, better use
of commercial off-the-shelf (COTS) solutions
and more interoperability across government
departments and third parties.
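The three-tier scheme, and the guard against the 'just in case' over-marking described above, can be sketched in a few lines. The justification rule here is our own illustration of the principle, not official policy:

```python
# Sketch of the three Government Security Classifications, with an illustrative
# guard against over-marking. The justification rule is hypothetical.

from enum import IntEnum

class Classification(IntEnum):
    OFFICIAL = 1      # the vast majority of information
    SECRET = 2
    TOP_SECRET = 3

def mark(level: Classification, justification: str = "") -> Classification:
    """Require an explicit justification to mark above OFFICIAL."""
    if level > Classification.OFFICIAL and not justification.strip():
        raise ValueError("marking above OFFICIAL needs a recorded justification")
    return level

mark(Classification.OFFICIAL)                   # fine: the default tier
mark(Classification.SECRET, "threat model X")   # fine: justified
```

Making the higher tiers cost a sentence of justification nudges staff towards the informed judgement the policy intends, rather than reflexive over-marking.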
Professional security
Not so long ago, a government security
professional was seen as someone who
defended the old ways of working and
entrenched government security more and
more into a ‘closed club’ where debate
was not welcome.
But here, too, a culture change is
taking place. Key to it is the recognition of
security as a profession in government2, providing a focus for recruitment, training and continuous personal and professional development for all working in security in the civil service.

This new government security profession is about a group of people who are well informed and knowledgeable, but whose primary concern is making the world of information security accessible to business. There is a world-wide and growing shortage of security skills; estimates are that the industry may be short of as many as one million security professionals worldwide3. The establishment of the security profession is a recognition across government of the critical importance of security, and that in order to get security right, the people who do it must be up to the task.

The lack of cyber skills is a particular area of concern. Civil service cyber apprenticeships are only one of a number of initiatives intended to address this4. The scheme is aimed at school leavers with at least two A-levels and allows the apprentices to acquire up-to-date technical expertise, whilst also gaining critical workplace skills.

Bringing security upstream
Another noticeable change in government security culture is driven by the speed of digital delivery and the use of agile methods. In the waterfall model, security requirements would be defined by a team of experts at key points and then built and tested before go-live. In an agile delivery, security needs to be considered and built into processes as they are being developed. Security requirements must be considered as user stories develop, and checked against the high-level security design so that any security risks can be managed alongside other risks such as cost, quality and usability.
Agile methods lend themselves very
effectively to secure development; for
example, pair programming, peer reviews,
change tracking and ‘test-first’. This
approach means that all team members and
stakeholders understand and identify risks,
and ensures that appropriate mitigation
(through technology, process or policy)
becomes an integral part of a digital service.
Implemented well, all this helps to
bridge the traditional gap between security
and development and deliver business
code quickly and securely.
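The 'test-first' idea mentioned above can be shown with a hypothetical security user story - "passwords are never stored in clear" - where the security acceptance checks are written as executable tests before the feature ships:

```python
# 'Test-first' security in an agile story (hypothetical example): write the
# security acceptance checks before the feature goes live.

import hashlib
import hmac
import os

def store_password(password: str) -> tuple[bytes, bytes]:
    """Return (salt, hash); the clear-text password is never persisted."""
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return salt, digest

def verify_password(password: str, salt: bytes, digest: bytes) -> bool:
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return hmac.compare_digest(candidate, digest)

# The security user story, expressed as checks that must pass before go-live:
salt, digest = store_password("correct horse")
assert b"correct horse" not in salt + digest   # no clear text stored
assert verify_password("correct horse", salt, digest)
assert not verify_password("wrong", salt, digest)
```

Because the checks run on every build, the mitigation stays an integral part of the service rather than a late-stage audit item.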
Holistic security
In many government departments,
accountability for different elements of
security is distributed across a number of
business areas; and, notably, the various
roles and responsibilities also differ from
department to department. The need to
be more responsive to a rapidly changing
threat landscape and the need to provide
flexible, appropriate and proactive
responses is driving ever closer
cooperation within departments and across
government.
Another interesting aspect of this
development is that in the past the
security community has focused mostly,
and sometimes almost exclusively, on the
confidentiality part of the CIA model. Now
integrity and availability are becoming
equally important factors in managing the
overall risk to the organisation. The focus
is on ensuring that security’s role is to
enable the government’s businesses and
service to be delivered to citizens safely
and securely.
The future
To coin a phrase, government security is on a
journey. There is still some way to go –
culture cannot be changed as quickly as
policy and standards; closing the skills gap
will take time. We have started, but there is a lot more to do. Establishing effective, lasting partnerships between government security and the IT industry is on the critical path.
References
1. https://www.gov.uk/government/uploads/system/uploads/attachment_data/file/251480/Government-Security-Classifications-April-2014.pdf
2. https://www.gov.uk/government/organisations/government-security-profession
3. CISCO 2014 Annual Security Report, p. 60
4. http://cybersecuritychallenge.org.uk/; https://www.mi5.gov.uk/careers/technical-apprenticeships.aspx; http://www.e-skills.com/news-and-events/march-2014/employers-create-specialist-cyber-security-apprenticeships/
SECURITY
CYBER RISK REGISTER
Cyber risk advisor Shaun Cooper MBCS makes the case for organisations having a cyber risk register
and explains how to go about creating one.
Information technology and cyber security
managers face a continuous battle to help
board directors understand cyber risk.
Often, the focus is on cyber threats, and
on the measures taken to mitigate cyber
incidents or reduce their impact.
Communicating in this fashion to the
board is certainly important because
security control measures provide
practical protection, and this can help
justify the investment made in them.
However, a business can still be left
exposed if communication to the board
is too focussed on threats and mitigation
measures. Rather, to understand
what cyber risk really means to their
98
DIGITAL LEADERS 2015
organisation, and to make informed
decisions about how to reduce exposure, a
board needs a much broader framework.
A cyber risk register provides this
framework.
A cyber risk register, like any
risk register, is a table showing the
highest risks, and opportunities, to the
organisation. It provides a framework to
consider those risks and opportunities:
for example, what is the organisation’s
appetite for them, what treatment should
be applied, by whom and by when?
This makes it easier to understand what
cyber risk ‘means’ for the organisation.
It allows the board and those with
operational accountability to consider
and discuss those risks, to collaborate,
and to achieve consensus. At the very
least, a cyber risk register is a useful tool
to stimulate conversation. But its real
value lies in ensuring all significant cyber
risks are being identified, assessed and
managed.
A cyber risk register is also useful as
a means of communicating to external
stakeholders that the organisation
understands and is managing its cyber
risk. It is indicative of a positive risk
management culture and, for example,
can have a positive influence on premiums
charged by insurers. More broadly, it is perhaps a reflection of how well an organisation is run.

In order to comply with the findings of the Turnbull Report, public companies will no doubt already have dedicated enterprise risk teams responsible for developing and maintaining a cyber risk register. However, for the majority of organisations, a cyber risk register is likely to be just one of many responsibilities allocated to heads of finance or IT. So how can they easily create and update a cyber risk register?

Creating a cyber risk register
The process should start by identifying the organisation's key assets. Assets range from people assets and information assets to property assets and systems assets. The next step is to assess the threats to those assets, i.e. who or what threatens the asset, and how? A basic list of threats is provided by ISO 27001, but any threat list needs to be updated constantly.

The probability and impact (financial and reputational) of each risk should then be assessed. This can be largely subjective, yet if some attempt is made to quantify them it often gives others in the organisation a more tangible opportunity to challenge them. This can often lead to the development of useful approaches to quantification which otherwise might never have been envisaged.

The result is a numerical assessment of residual risk. Risks can then be ranked and added to the cyber risk register.
The Association of Chartered Certified
Accountants provides a good threshold for
which risks to include on a risk register:
any risk/opportunity with an impact of
more than five per cent of business profit.
An example of how a risk can then be
expressed on the cyber risk register is:
‘customer financial information being
exported onto a cloud storage service,
accidentally by an employee. The overall
impact is £2m-£2.5m.’
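The scoring and threshold steps described above can be sketched numerically. The entries, probabilities, control effectiveness figures and the annual profit below are hypothetical; only the five-per-cent-of-profit threshold comes from the text:

```python
# Sketch of the scoring step: probability x impact gives a residual-risk figure,
# and the ACCA threshold (impact above five per cent of business profit)
# decides what goes on the register. All figures are hypothetical.

def residual_risk(probability: float, impact_gbp: float,
                  control_effectiveness: float) -> float:
    """Expected annual loss after existing controls are taken into account."""
    return probability * impact_gbp * (1 - control_effectiveness)

annual_profit = 40_000_000
threshold = 0.05 * annual_profit   # ACCA: five per cent of business profit

risks = {
    # name: (probability, impact in GBP, control effectiveness)
    "customer data exported to cloud storage by accident": (0.3, 2_500_000, 0.2),
    "phishing leads to payroll fraud": (0.5, 800_000, 0.5),
    "website defacement": (0.6, 50_000, 0.5),
}
register = {name: residual_risk(*params)
            for name, params in risks.items()
            if params[1] > threshold}   # only above-threshold impacts qualify
```

However rough the inputs, writing them down gives the rest of the business something tangible to challenge, which is the point of the exercise.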
The first attempt to identify assets and threats should arguably be made by someone from the information technology/security team. The resulting draft cyber risk register can then be used as a framework for brainstorming sessions with other representatives from across the business.
These sessions will not only help add
and refine the identified risks, they will also
help identify the control measures which
are already in place, or which should be in
place. These sessions can also help assess
whether the residual risk is likely to be
acceptable or not.
Using a cyber risk register
The cyber risk register can then be used
by the board to determine appetite for
each risk/opportunity and what treatment
options they wish to apply: whether to
accept the risk, reduce, avoid or transfer
it. Actions and accountability will result
from this process. The cyber risk register
should, of course, be updated regularly.
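A register entry that records the board's treatment decision could look like the sketch below. The field names and the example values are illustrative, not a standard format:

```python
# Sketch of recording the board's treatment decision for a register entry:
# accept, reduce, avoid or transfer, with an owner and a review date.
# Field names and values are illustrative only.

from dataclasses import dataclass
from enum import Enum

class Treatment(Enum):
    ACCEPT = "accept"
    REDUCE = "reduce"
    AVOID = "avoid"
    TRANSFER = "transfer"   # e.g. via cyber insurance

@dataclass
class RegisterEntry:
    risk: str
    impact_gbp: tuple[int, int]   # low-high estimate
    treatment: Treatment
    owner: str                    # accountability resulting from the decision
    review_by: str                # the register must be updated regularly

entry = RegisterEntry(
    risk="customer financial information exported to cloud storage by accident",
    impact_gbp=(2_000_000, 2_500_000),
    treatment=Treatment.REDUCE,
    owner="Head of IT",
    review_by="2015-12-31",
)
```

Capturing the owner and review date alongside the decision is what turns the register from a discussion aid into a management tool.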
Irrespective of the actual decision the
board makes on how to treat each risk, the
point is that this is an informed decision. It
is also one which internal stakeholders are
aware of, and have contributed to.
As noted above, potential opportunities
should be assessed in a similar way, and
where these have the potential to add
significantly to profitability, programmes
should be considered to actively harness
these.
A cyber risk register is not a silver bullet for cyber risk. But as a framework for educating, collaborating and managing cyber risk, it can create significant value through the increased confidence and informed decision-making it gives senior management and the board.
2015 DIGITAL LEADERS
99
SECURITY
CYBER WAR AND YOU
Szilard Stange, Director of Product Management at OPSWAT, answers some frequently asked questions that your organisation will want answers to regarding international cyber warfare.
The task of maintaining a secure digital environment is far from trivial, as the activity of criminal groups posing an imminent threat via cyber-attacks is increasing. There are a few factors that could be causing this increase in digital crime.
More criminals are simply turning to the internet instead of traditional criminal pursuits, and many extremist groups and totalitarian governments have managed to build up a cadre of security professionals and developers, boosting their presence on the dark side of the web.
But how do these recent developments in the international cyber war affect your daily life, and what should you do about it? Here’s everything you need to know.
What is their motivation?
There is no easy answer to this question. Many cyber criminals are simply looking for money and fame among their fellows; others are motivated by ideological or political considerations. These disparate motivations can result in criminal acts ranging from the formation of huge computer networks (botnets) to drive ransomware attacks or orchestrate large-scale spam operations, to stealing data or disturbing the operations of critical infrastructure.
Who sponsors them?
Many cyber-crime activities are self-financed, funded by the masses that fall for world-wide spam, phishing and ransomware attacks. The large-scale bank heist1 recently reported by Kaspersky Lab proves that malware-borne attacks can provide a huge payout for the criminals! Many governments fund their own cyber armies, so to speak, focused on espionage or even enacting physical attacks, as was seen with the famous Stuxnet worm2 that was used to sabotage the Iranian nuclear programme.
Why aren’t they arrested?
Cyber-crimes are quite difficult to prosecute, because the perpetrators are well-versed in techniques to conceal their identity and location. Individual IP addresses can be hidden by anonymity networks like Tor3.
Sometimes the nature of the crime also protects criminals from identification and prosecution. For example, a botnet could contain thousands of infected computers located all over the world. Most of the time the owners of the infected computers don’t even know they are infected, and it is difficult for the authorities to differentiate these innocent victims from the perpetrator of the crime.
Can they cause a real war?
According to industry speculation, it’s quite possible. But the evidence is clear that they could, and indeed have already participated in warfare. We need only look at the successful intrusions into critical infrastructure like nuclear facilities and breaches of military networks to begin to understand the role cyber-warfare plays.
The US sanctioning North Korea4 over its alleged role in the Sony Pictures hack, and North Korea’s response, is one of the best examples to date of real geo-political consequences stemming from a cyber-attack. Several countries have made it clear that they will respond to cyber-attacks using traditional warfare if necessary – quite a disturbing prospect!
How could this affect me?
If a criminal’s motivation is to steal money or data then you could be a direct victim. I recommend that you carefully check your accounts and credit report on a regular basis, and take care when logging into websites containing your sensitive data.
However, higher-level cybercrime, like taking control of critical infrastructure facilities, probably doesn’t pose a direct threat. Even if the attackers can intrude into a computer in a nuclear facility, these systems have redundant layers of physical protection to defend the infrastructure and to prevent catastrophic events.
Is my computer part of these wars?
The highest risk for you as a consumer is being looped into a botnet. If your computer is linked to such a network, it will not only consume your resources (RAM, CPU, internet bandwidth), but it may also attract the attention of law-enforcement agencies.
In the past it was enough to avoid browsing porn sites, not to download pirated media or programs, and to make sure that your operating system and antivirus programs were up-to-date, but just avoiding risky behaviour is no longer enough! Everyone should be cautious, as more attackers use zero-day vulnerabilities and hack well-known sites to demonstrate their capabilities and to distribute new malware.
What can I do?
Up-to-date security tools, like all-in-one desktop security programs, are more important than before, but they may not be enough to keep you safe. We recommend using anti-malware multi-scanning to assess suspicious files, as well as a solution that will assess your system for threats your installed antivirus may have missed. When you receive a notification that you need to update a key program on your computer, like Java or Adobe Flash, don’t just ignore it!
Many viruses rely on out-of-date software to gain access to your machine, so applying security updates for your operating system and other third-party programs without delay is also a key step to minimise the risk of malware infection.
And finally, if you need to access your employer’s corporate network from your home computer, make sure you comply with your company’s security policies! Using the security tools that are provided to you is more important now than ever before.
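The multi-scanning idea recommended above (asking several independent engines for a verdict on a file and treating any single detection as grounds for suspicion) can be sketched in miniature. The two ‘engines’ below are toy stand-ins, not real antivirus products, and the blocklist sample is invented:

```python
# Toy sketch of the multiscanning idea: ask several independent
# "engines" for a verdict on a file and flag it if any engine objects.
import hashlib

# Hypothetical blocklist of SHA-256 digests of known-bad content.
KNOWN_BAD_SHA256 = {hashlib.sha256(b"EVIL-SAMPLE").hexdigest()}

def engine_hash_blocklist(data: bytes) -> bool:
    """Flag content whose SHA-256 digest is on the blocklist."""
    return hashlib.sha256(data).hexdigest() in KNOWN_BAD_SHA256

def engine_executable_header(data: bytes) -> bool:
    """Flag content that starts like a Windows executable (MZ header)."""
    return data.startswith(b"MZ")

ENGINES = [engine_hash_blocklist, engine_executable_header]

def is_suspicious(data: bytes) -> bool:
    # Multiscanning treats a file as suspicious if ANY engine flags it,
    # so one engine's miss is covered by another's detection.
    return any(engine(data) for engine in ENGINES)
```

The value of the approach is exactly this `any()`: engines with different detection techniques fail independently, so running them together catches threats a single product would miss.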
References
1 www.bbc.co.uk/news/technology-31487258
2 www.wired.com/2014/11/countdown-to-zero-day-stuxnet/
3 https://www.torproject.org/
4 www.theguardian.com/world/2015/jan/04/north-korea-fury-us-sanctions-sony
Image: iStockPhoto
FEAR, UNCERTAINTY
AND DOUBT
Egnyte CEO Vineet Jain reports
on the failed USA Freedom Act
and the possible global
consequences of its recent
rejection by the US Senate.
It’s hard to believe it has now been over
20 months since Edward Snowden first
revealed information about the NSA’s
secret surveillance programmes to the
world. Since that day in June 2013, data
privacy and security have taken on a
whole new level of criticality.
Organisations all over the world have
had to adjust accordingly – creating
advanced security features, adhering
to more compliance protocols, hiring
dedicated security/privacy officers, and
even launching PRISM-proof products.
The US Government has taken a notable
action of putting together a new bill
called the USA Freedom Act. Few people
realise that the act’s name is a backronym
(an acronym specially constructed to fit an
existing word): it stands for ‘Uniting and
Strengthening America by Fulfilling Rights
and Ending Eavesdropping, Dragnet-collection
and Online Monitoring’. The bill was brought
into Congress just a few months after the
leak, in October of 2013. Led by Congressman
Jim Sensenbrenner and Senator Patrick Leahy,
the bill was ultimately meant to end mass
collection of Americans’ data and create
a much higher level of transparency
between the government and the public
regarding how data is being collected.
While many of the major tech
companies in the US, such as Google,
Microsoft and Facebook were in favour
of the bill, their combined market cap of
over $1 trillion and overall clout was not
enough to get the bill passed by Congress.
The bill was halted in the Senate after
falling just two votes shy of the required
60. The main opposition to the bill was
that it would tie the government’s hands in
defending the USA against terrorism.
While national security is a vital and
valid concern, it is widely agreed that
the current programmes in place are
overstepping boundaries and are in
violation of Americans’ Fourth Amendment
rights.
Going one step further, I see this violation
as the single ‘drop of water’ causing a
very negative ripple effect all over the
world. Businesses are suffering and more
specifically, the tech industry will see a
lot of customer churn if organisations are
unable to protect their customers’ data.
Let’s take a look at some of the effects felt by businesses in the US and overseas as a result of the failed USA Freedom Act:
1) Europe is learning from US mistakes
As far as security, regulation and compliance go, Europe was already miles ahead of the United States. As the US suffers breaches like Target, or the most recent Sony disaster1, European governments are able to see holes in the US system and address them accordingly.
While it has been in discussion for some time, the European Commission continues to make progressive adjustments to its General Data Protection Regulation (GDPR). This new set of regulations would protect the privacy of European citizens much more thoroughly than anything that’s in place in the US. As a result, American businesses will have to comply with any and all EU regulations before they can even begin to think about doing business overseas.
2) Businesses are losing money
The fastest way for a business to lose money is to lose customers, and if customers cannot trust that their private information will be protected by corporations, they tend to stop doing business with those organisations. As a direct result of a lack of trust in American security protocols, the slippery slope looks to be getting worse, putting millions of dollars in jeopardy.
Gary Shapiro, the CEO of the Consumer Electronics Association, expressed his concerns for business, both domestically and internationally, in an open letter supporting the USA Freedom Act, saying: ‘Several companies, including members of CEA, have already lost contracts with foreign governments worth millions of dollars.’
3) Innovation is taking a back seat
By rejecting the USA Freedom Act last
month, the Senate created even more FUD
(fear, uncertainty and doubt) between
businesses and their customers. Without
realising the ramifications of their actions,
the government forced businesses into
reacting to the fallout.
It has now become the job of US
companies to spend valuable resources on
gaining trust back from their customers
and creating ways to protect them from
government spying. Considering a lot
of companies are not built or designed
for this kind of development, there is a
major shift in focus towards fire-fighting
and unfortunately innovation suffers as a
consequence.
As many still consider the US to be
in the middle of a ‘tech bubble 2.0’2, the
bubble could soon pop if we can’t find a
way to address these data privacy issues.
If we can’t come to a consensus on a bill
like the USA Freedom Act, which was a
progressive step forward, the US may be
facing its toughest global competitor yet:
itself.
References
1 http://www.nytimes.com/2014/12/18/world/asia/us-links-north-korea-to-sony-hacking.html?_r=2
2 http://money.cnn.com/2014/11/13/investing/nasdaq-5000/
GENERATION GAME
Louise T. Dunne, Managing Director at Auriga, explains why she thinks the millennial generation will
struggle to apply security and discusses what organisations can do to help them improve their security
mind-set.
Technologically savvy with a high
exposure to all types of media, Millennials
– or Generation Y, soon to be joined by
Z – are highly adaptable and capable of
creating and utilising data over multiple
platforms.
They’re very comfortable with abstract
concepts, such as virtual consumerism
and the cloud, and socialise as much
online as they do in real life. But does this
equate to technological proficiency or is
it a form of dependency? What could the
knock-on effects be for the workplace?
One risk is that this technological
preoccupation could make it harder to
implement information security effectively.
Security is absolutely not a technical challenge; it’s about people and process. It means educating staff at all stages, from the time they join to the time they leave (when their access is revoked); communicating and enforcing policies and procedures to give clear guidance on security best practice; and incorporating those security measures into the processes of the business to ensure it operates securely and efficiently.
For a generation raised on technology,
the idea of putting people and process first
may seem arcane. Technological solutions
offer reassurance. But, while technology
has its place as a resource within the
security arsenal, it is often within itself
a contributory factor to risk. Without
controlled application, monitoring and
interpretation, technology solves nothing.
Take the first Sony PlayStation Network compromise. Yes, the gaming giant applied firewalls and web application software, but it didn’t apply the firewalls in the right places and the web application software was out-of-date in terms of patching.
Throwing technology at a security challenge is not only a waste of money but leads to a false sense of security. Understanding the risks and applying governance as well as technology can mitigate that risk. It’s a lesson many businesses have failed to grasp until they’ve suffered a breach.
Up until that point, they value the store placed in governance by their customers, but don’t go beyond the bare minimum requirements, paying credence to a governance framework of some type, such as ISO27001 or PCI DSS.
A lack of belief in good security governance results in a lack of drive and investment. Typically such organisations have lots of shelf-ware and a sporadic application of controls to those areas of the business, such as HR and IT, where they need to demonstrate good security governance.
This approach not only hamstrings the organisation (by not having a uniform approach to security), but also creates a false sense of security, especially at board level, where attitudes can often smack of ‘don’t worry, we’re compliant’.
Achieving a security-led organisation
means promoting security as a process not
a product to Generation Y. Use staff surveys,
provide a mobile app to staff and/or
customers and measure reported incidents.
Staff training programmes can then be
tailored to meet the evolving threats to the
business making them meaningful. There
is no substitute for scenario-based training
which is realistic and preferably tailored to
the business area or sector.
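The measurement loop suggested here (collect the incidents staff report, then tailor training to what they actually encounter) can be sketched simply. The incident categories and counts below are invented for illustration:

```python
# Sketch: tailor training topics to what staff actually report.
# Categories and counts are invented for the example.
from collections import Counter

reported_incidents = [
    "phishing email", "phishing email", "lost device",
    "phishing email", "tailgating", "lost device",
]

# Tally reports by category to see where the real exposure is.
by_category = Counter(reported_incidents)

# The two most-reported categories become the next training focus.
training_focus = [category for category, _ in by_category.most_common(2)]
print(training_focus)
```

Even a tally this crude turns training from an annual box-tick into something driven by the threats staff are actually seeing.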
Getting the employee to see how easily a compromise can happen and the results of neglect or negligence can be
highly effective, provided such training is
performed regularly, not just annually or
upon recruitment.
In some instances, adopting a process-driven security programme will require
the company to restructure. The size of the
project depends on many variables: sector,
retrofit or green field, size, culture and
business model.
The worst-case scenario would be
an organisation which is large, needs
to retrofit security in a well-established
business and is in a complex business
sector with staff who generally don’t accept
change easily. In most cases, organisations
do have some security provision although
it tends to be ad hoc and undocumented.
The lion’s share of the task then lies in
understanding the risks the company faces
and embedding good security governance.
In the future, threats will almost certainly be greater, if not in number then in scope and sophistication, and it’s the millennials who will fight the good fight for us.
A coherent, corporate-wide strategy
applied to a fully understood risk landscape
is a far more effective means of staving
off a malicious attack than any technical
point solution. Our legacy is to arm this
generation with the knowledge as well as
the tools for the job.
SERVICE MANAGEMENT
BACK TO THE FUTURE
Chris Evans MBCS, IT service management specialist, Service Focus, examines the service
management industry and finds it needing some realignment.
In recent times, the service management
industry has become enamoured with the
concept of ‘back to basics’. Various initiatives,
frameworks, blog posts and articles are
suggesting a return to the days when we
focused on the customer and business, and
service delivery was king.
In the last 20 years I have seen the
various cycles of thinking that come and go
within our industry and I am struggling to
remember a utopian time when everyone
was completely committed to the delivery
of service, technology was seen as purely
an enabler and the primary driver was
to deliver utility and warranty to our
customers. Maybe I rebooted and missed
it? Maybe it got lost due to a Y2K bug?
Perhaps it’s floating in someone’s private
cloud?
The truth, or my truth at least, is that
we have always latched onto the ‘next big
thing’ and still continue to do so now. You
only have to look at the proliferation of
new words in the industry (gamification,
cloud, BYOD, big data and the numerous
XaaS, to name but a few) to realise that
we are still enchanted by the latest bells
and whistles and consumed by a desire to
deploy them as soon as possible.
This is regardless of whether the
customer wants them or not and with a
spectacular lack of concern for whether
it is in their best interests. We have
applied an ‘if it doesn’t work, swap it out’
approach not just to hardware break-fix,
but also to all areas of people, process
and technology. After all why fix a problem
when you can hide it, move it or distract
from it with another problem?
Essentially it is not a backwards step
we need but a significant forward leap in
thinking and practice. We cannot return
to a standard we have never actually
achieved, we need to work towards it.
Business alignment
The IT department as we know it is dead.
We are now the IT capability of the business,
which is and always has been the sole
reason we are in existence. We are there
to enable future plans, to protect the
current estate and to assist in the efficient
running of the day-to-day business
operations. We must ensure that we no
longer sit in an ivory tower, holding the
keys to the kingdom. In the modern world
of outsourcing, cloud provision and
everything and anything-as-a-service it
is naïve to assume that we are the only
game in town and therefore irreplaceable.
We must work to be a trusted business
partner, to demonstrate through delivered
value that we are a core part of the
business and therefore integral to its
roadmap and forward-planning. Only
by becoming a key component of the
decision-making process can we future-proof our existence.
Breeding for success
We need to breed service people. Technical experts locked in basements, deprived of sunlight and human contact, should be a thing of the past and, where still in existence, should be hunted down and removed with all haste. The people we require for the IT function of the future will of course need a good knowledge of their subject area but, beyond that and far more importantly, they will have the right attitude.
We need intelligent, customer-focussed innovators who deliver service as a minimum and who see quality delivery and service improvement as a daily requirement rather than an aspirational dream. Buying in skills from the outside at inflated rates and then watching them walk out the door when a better offer comes along is detrimental to service delivery and leads to stagnation of service, lack of promotional opportunities and an inevitable demoralisation of existing, often highly loyal employees.
We need to bring back the ‘old school’ concept of apprenticeships: a structured development programme that nurtures the professionals of the future, exposing them to a range of activity both inside and outside of our business area to make them well-rounded and multi-skilled.
Additionally we need to work on qualifications, to come together as an industry and centralise our recognition systems, giving them real credibility and a marketing strategy that gives them recognisable value to the individual, their peers and future employers globally.
Adapt and overcome
We need to be more agile. I am not just talking about the PM methodology here, but our thinking. We need to be able to flex, to roll with the punches, to understand that the business world does not stand still. To use an old expression: ‘you snooze, you lose’.
We need to structure our service
delivery with the right level of governance
to be in control of course, but not to
a level that stifles innovation. A key
element of this is to enhance our ability
to embrace change. We are here for the
business and as such we should position
ourselves to take advantage of change as
an opportunity. For example, I have heard
many organisations talk about BYOD as
a risk to security and an administrative
nightmare. Nobody has, in my earshot,
talked about reducing the total cost of
ownership for hardware by using users’
own devices. Nor have they mentioned
the enhanced customer experience or the
reduction in support load as they are using
devices they are already familiar with.
We also need to move away from
protecting our jobs as they stand and look
towards making ourselves indispensable in
the brave new world. For instance, social
media opportunities for support are now
threatening the existence of the traditional
service desk. Do we therefore reject these
new technologies to protect roles or do
we embrace them and look to create new,
exciting roles and opportunities out of their
adoption?
The delivery mechanism of choice?
Service management is here to stay but
if we are to stay with it, we cannot rest on
our laurels and assume that we will always
be the delivery mechanism of choice.
There is a strange correlation in that
as IT becomes ever more critical to daily
life, those who design, deliver and support
it seem to make themselves less relevant.
Users are becoming ever more skilled and,
in their desire to improve delivery, they can
and will now bypass us to get the service
they require. If we are to remain current
and in demand we must offer services that
they cannot obtain themselves or deliver
them in a way to make us more attractive
than the DIY route.
The future is here. The landscape is
changing and unless we change with it and
reach a new level of service delivery, we
will become extinct.
DOING IT WRONG
Steve Sutton CITP MBCS, IT
Manager, Infrastructure Europe, Optimal Payments, takes
a look at some of the common
misconceptions
regarding ITIL and what it means
to be ‘compliant’.
When asking IT staff about their view on
ITIL you may frequently hear the following
(puzzling) responses:
‘Yes, we are ITIL compliant.’
‘It’s a load of rubbish, it doesn’t work’ (or
‘we tried it and it didn’t work’).
‘It’s too much effort.’
All are sure-fire indicators that the
fundamentals of ITIL have been ignored
and that the respondent hasn’t fully
understood the basis on which it is founded.
ITIL is a best-practice framework, it is
not prescriptive. That is to say: it does not
tell you what you must do, or how to do it.
You cannot therefore be ‘ITIL-compliant’, in
the way that you can be ‘PCI-compliant’ or
‘ISO27001-compliant’.
The purpose of ITIL is to provide a
generic structured approach to the
running of an IT department. It is based
essentially on a list of things you probably
should be looking at.
It proposes that you have certain things
in place, such as a service desk or change
management, but it does not tell you
how these things must be implemented;
there is no ‘must’. The driving principle
is to adopt and adapt what works for
your organisation, at this time, in this
environment.
Equally, for those that say: ‘ITIL doesn’t
work’, you’ve missed the whole point.
Adopt and adapt! If you find that an
element doesn’t work for you, don’t use it
- after all, it’s only a suggestion. In reality,
given the extensive international use of
ITIL and the volume of IT professionals
that have made it work, it is probably not
the proposal by ITIL that doesn’t work for
you, but more likely your implementation
or your interpretation of the proposal that
is wrong.
As for ‘too much effort’; it isn’t. It’s as
much effort as you can spare or want to
put into it. You don’t have to do it all on
day one or even in year one! Although
one could argue that if following best
practice is too much effort you have more
fundamental problems to resolve first.
Everything contained within ITIL, more so the earlier versions, is common-sense ‘should be doing it anyway’ stuff, so why would you not want to do it?
Like with all frameworks, if you follow them blindly without applying your knowledge to them, you will end up with a wonderfully bureaucratic scenario where everything has a procedure, a form to complete and a tick in a box. You’ll spend more time operating the process than doing the work, and that won’t get you anywhere. That’s why it’s a framework, not a prescriptive set of rules.
Apply what works, change what doesn’t, go as far as you need to, stop, reflect, think about what is and isn’t working and make changes where needed. After all, one of the sections of ITIL is continuous improvement - a key theme in all management teachings, both academic and technical.
It is possible (but unlikely) that you could end up adopting none of ITIL, or adapting it so much that it hardly represents the original constructs. At least in this scenario you have taken the time to look at a best-practice framework, evaluate it for your situation and make a conscious decision not to adopt it. In theory, you could be ‘ITIL-compliant’ (whatever that means) by officially agreeing not to implement ITIL at all! You have successfully adopted what is good for you (none of it) and adapted by doing something else.
Another common mistake seems to be confusing ITIL (a framework for best practice in IT) with other schemes and assuming that you cannot do more than one. It is perfectly possible to follow ITIL guidelines on having a service desk and incident management backed by a configuration management database (CMDB), whilst handling the tickets within it in an agile Kanban (a self-defined, just-in-time, low work-in-progress environment) manner. This scenario could then sit within the scope of an ISO27001 (prescriptive) and PCI-DSS (prescriptive) accredited environment.
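As a minimal sketch of what handling tickets ‘in a Kanban manner’ means in practice, the snippet below enforces a work-in-progress limit on a column of incident tickets. The ticket IDs and the limit of two are invented for the example:

```python
# Sketch of a Kanban column for service-desk tickets: a column refuses
# new work once its work-in-progress (WIP) limit is reached, keeping
# work-in-progress low and flow just-in-time.

class KanbanColumn:
    def __init__(self, name: str, wip_limit: int):
        self.name = name
        self.wip_limit = wip_limit
        self.tickets: list[str] = []

    def pull(self, ticket: str) -> bool:
        """Pull a ticket into the column only if WIP allows it."""
        if len(self.tickets) >= self.wip_limit:
            return False  # at the limit: finish something first
        self.tickets.append(ticket)
        return True

    def finish(self, ticket: str) -> None:
        self.tickets.remove(ticket)

in_progress = KanbanColumn("In progress", wip_limit=2)
assert in_progress.pull("INC-101")
assert in_progress.pull("INC-102")
assert not in_progress.pull("INC-103")  # WIP limit reached, pull refused
in_progress.finish("INC-101")
assert in_progress.pull("INC-103")      # capacity freed, pull succeeds
```

The point of the WIP limit is exactly the ‘adopt and adapt’ spirit: the ITIL incident records stay intact, while the team self-limits how much it juggles at once.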
It is interesting that frequently those in
the ‘ITIL doesn’t work’ camp are those that
would most benefit from ITIL influence –
i.e. the kind of IT teams that are constantly
reacting instead of being proactive, the
kind of teams that are in a constant state
of panic as one priority pushes out of the
way the last before it is completed.
ITIL does work; you just have to make it
work for you. In order to do this you need
to understand both the operations of your
IT department and the wider business and
you also need to understand the nature of
the IT environment.
IT is a somewhat unique environment
where technology strategy must
drive continuous improvement or the
infrastructure will not be able to support
the very core technologies that keep the
business running.
Keeping up with technology (servers,
desktops, security, storage, email, internet
etc.) and delivering value for the business
(new systems, applications, customer
interfaces, websites, software development
etc.) means that both elements need to be
maintained and much of the infrastructure
work is hidden (rightfully so) away from the
wider business. Unfortunately this means
the wider business is also not aware of the
demands and pressures put upon the IT
department.
ITIL provides a guideline framework only
and, when applied at a strategic level, it
will give structure and direction to those
seeking to maintain, improve and excel at
IT service support and service delivery.
It is not the panacea to all of IT’s
problems, which is why you need
experienced, qualified (hopefully
chartered) and intelligent IT professionals
to decide what is appropriate, what can
be done, what should be done and most
importantly... what should not be done!
ELIMINATING WASTE
Dr David Miller FBCS, management consultant and Managing Director of ITDYNAMICS, discusses the benefits of having an automated IT service management system supporting active management.
Recent research1 shows that the way that IT has always been managed is itself often the cause of investment failure.
It also suggests an approach to IT management and governance that is provably correct, makes sense to business people, helps manage complexity, maps to current IT methods, links business agility to agile IT, and remains valid in the context of future technologies.
Importance and urgency
Businesses invest in IT to stay competitive but the success rate is poor. Whilst IT management practice has developed over the last 50 years, the success rate has remained at about 35 per cent, meaning that globally some $2.5 trillion of IT spend is currently at risk annually.
To improve this success rate the emphasis has always been on improving the modus operandi involved in building and operating IT product, i.e. the IT methods, standards and best practice, and the extent to which these are implemented in any given situation. Instead the research questioned the underlying IT delivery model of ‘define, build and operate’.
It assumes that this model has always been inadequate in some way and will become less relevant in the future as emerging and future complexity makes defining and testing product requirements difficult, if not impossible, except at run time2.
It was necessary to get back to basics by understanding the nature of the true relationship between businesses and IT services organisations, unfettered by traditional practices and customs.
[Figure: the Business and IT Relationship Model (BITRM). The ‘total business experience’ sits at the centre; the horizontal axis runs from ‘business need’ to ‘service specification’ and is measured by ‘closeness’; the vertical axis runs from ‘change’ to ‘alignment and governance’ and is measured by ‘agility of response’.]
The true relationship between business
and IT
The research revealed the true
relationship between business and IT,
which is fully described by the Business
and IT Relationship Model3 (BITRM).
Central to the relationship is the ‘total
business experience’ (TBE), which is also
the core category, i.e. it accounts for the
most variation around the problem of IT
delivery.
On the horizontal axis it is positioned
between the ‘business need’ and the
‘service specification’, reflecting the broad
spectrum of business requirement and
how well services and needs are aligned;
‘closeness’ is a measure of alignment.
On the vertical axis the TBE is positioned
between ‘change’ and ‘alignment and
governance’ and reflects how well the
business and the IT services organisations
are responding to the changes taking
place around them. ‘Agility of response’ is
a measure of the time taken to respond to
change or a need.
In this model the needs of the customer
and consumer are the principal quality
driver4; business people judge services
by making sense of what they experience
relative to their needs. Most business
decisions about IT are made in this way.
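The two measures on the model’s axes can be made concrete with a small sketch. Note that the research defines only what each axis measures, not a scoring formula, so the quantifications below (days elapsed for agility, fraction of needs met for closeness) are illustrative assumptions:

```python
# Illustrative sketch of the two BITRM measures described above.
# The scoring scheme is invented; the model defines what each axis
# measures, not how to quantify it.
from datetime import date

def agility_of_response(need_raised: date, change_delivered: date) -> int:
    """Agility of response: days taken to respond to a change or need."""
    return (change_delivered - need_raised).days

def closeness(needs_met: int, needs_total: int) -> float:
    """Closeness: fraction of stated business needs the service meets."""
    return needs_met / needs_total

# A need raised in early January and met in mid-March.
print(agility_of_response(date(2015, 1, 5), date(2015, 3, 16)))
```

Tracking even crude numbers like these over time is what lets alignment be monitored rather than merely asserted.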
2015 DIGITAL LEADERS
111
SERVICE MANAGEMENT
Reflecting on current IT management
methods
Whilst the BITRM is a correct and complete
frame of reference for describing the
complexities of the relationship between
business and IT services organisations, it
doesn’t always reflect the behaviour of the
people. Behaviour patterns, constrained by
perspectives and current practices, create
blind spots; in worst case situations these are
seen to weaken every link within the BITRM.
As this occurs, IT decision-making
becomes more difficult and IT management
itself can appear weak. In practice this
shows how the current IT delivery model
can create the very conditions that give rise
to low success rates.
There are two constraints associated
with the current IT delivery model that are
worthy of note. Firstly the current delivery
model maintains a product and production
focus, which is not consistent with the
spectrum of business requirement, so
limiting what can be achieved during
alignment.
Secondly the current IT delivery model
is driven by methods and standards
that are explicitly or implicitly based on a Deming- or Shewhart-like 'plan, do, check, act' (PDCA) cycle – a production concept for continuous improvement and operational control.
This became the bedrock of total quality management (TQM), and all IT methods and standards rest upon it, whereas the BITRM reflects the dynamics of the true relationship.
New IT delivery model
The blind spots and other delivery
weaknesses can be avoided by using the
BITRM proactively to deliver a managed
business experience. Referred to as active
management, it uses the BITRM’s heuristic,
deterministic and controlling characteristics.
These characteristics enable service
quality to be more correctly described,
monitored and controlled relative to the
business need. This provides an acid test.
Where alignment was previously considered
to be uncertain5 or elusive6, now alignment
or otherwise can be proven, the impact
on the business can be determined and
solutions can be actively targeted.
The TBE is sensitive to service
deterioration, new needs gaps and
external change. ‘Alignment and
governance’ becomes a key control point
for the business, and closeness and agility
become key business measures.
The new IT delivery model is essential
to those managing real-world complexity,
ambiguity and dynamism7 where the
traditional management, alignment, and
governance mechanisms have never been
as straightforward as has been implied by
the methodologies8.
Looking forward
Active management provides an explicit layer of business control over IT services that has not always been present, and it will remain valid in the context of future technologies, for example, the internet of things, self-adaptive/self-managing systems, cognitive informatics and so on. It maps to the current IT methods and standards whilst encouraging the agile agenda, new collaborative management practices, new contractual frameworks for IT supply and automated IT service management systems to freely develop.

For businesses, and for the management and governance of large or complex business and IT landscapes, the benefits of an automated IT service management system supporting active management are expected to be large, and a prototype is under development. Waste will be reduced and, whereas IT governance has always been described as a process imposed by IT9 and undertaken by IT staff, active governance can become a business responsibility.

Please note: the terms 'active management' and 'active governance' rely on strict definitions and philosophies not fully discussed above and should not be used by others except to refer to the work described here.

References
1 Miller, D., Maximising the Business
Impact of IT: Importance of managing
the total business experience, School of
Science and Technology, 2013, Middlesex
University: London.
2 Kephart, J.O., & Chess, D.M., 2003. The
vision of autonomic computing. Computer.
IEEE Computer Society.
3 Miller, D., & Woodman, M., Service
models for IT management, IT alignment
and IT governance, The UK Academy
for Information Systems 2011, Oxford
University.
4 Vouk, M.A., 2008. Cloud Computing –
Issues, Research and Implementations.
Journal of Computing and Information
Technology.
5 Henderson, J.C. & Venkatraman, N., 1993.
Strategic alignment: Leveraging
information technology for transforming
organisations. IBM Systems Journal, 32.
6 Chan, Y.E., & Reich, B.H., 2007, IT
alignment: what have we learned? Journal
of Information Technology
7 Gummesson, E., 2002, Relationship
marketing and a new economy: It’s time
for de-programming, Journal of Services
Marketing.
8 Silvius, A.J.G., 2007, Business & IT
Alignment in Theory and Practice. 40th
Annual Hawaii International Conference on
System Sciences.
9 De Haes, S., & Van Grembergen, W.,
2005, IT Governance Structures, Processes
and Relational Mechanisms: Achieving IT/
Business Alignment in a Major Belgian
Financial Group, 38th Hawaii International
Conference on System Sciences.
FIVE STEPS
Michael Moulsdale FBCS reveals five steps to improving your internal IT service delivery team.

The IT service delivery team will be, to many people within your organisation, the face of the IT department. And it is the people on the service desk, or those that carry out desktop support, that make up that service delivery team.

It is therefore crucial to have a high-performing team to present IT in the best light. However, it is too often the case that, no matter how good the infrastructure, systems and in-house applications are, IT is complained about, and often this is down to the perceived quality of the service that the IT service delivery team is providing.

Often this lack of perceived quality is not the fault of the service delivery team, but of the resources that are provided to them and the team's or the resource's role within IT. Often the team is told to 'just get on with it' and provide a better level of service; however, there is more to it than that. In order to turn around a poor-performing service delivery team, or to create a new team, these five steps can be taken.

Step 1: Recognise their value
As with all change programmes, the first step involves buy-in and support from the leadership teams. That leadership team therefore needs to recognise the value of the service delivery team. They need to be shown that it is the service delivery team that is the face of IT within their business, and that it has a greater effect on the perception of IT than any other team.

Take a page from the business side of your business: gaining a new customer costs a lot more than retaining one, and studies have shown that disproportionately better profits are gained when customers are retained rather than having to attract new ones. It may be argued that internal customers are required to use the internal IT team and so the analogy does not hold. But in this day of BYOD, online email services, the ubiquity of computing at home and experts everywhere, you should ask yourself: do your customers really need to use your IT services? And if they stop using your services, what will happen to your IT department?

However, it is not enough to realise the team's value; the IT leadership team needs to tell everyone in the organisation of the value of that team, and to reinforce that message whenever possible.

In one organisation, once a year the IT leadership team would man the phones for a morning. Not only did this give the leadership team a sense of what the delivery team goes through, but it also provided management with a sense of the team's value.
Step 2: Get closer to the customer
The natural next step from the previous point is to get closer to the customer, i.e. the business user. Often the complaint from business users is that IT does not understand the pressure they are under, or does not understand the importance of such and such external customers or relationships.

This is an easy issue to fix and one that is the responsibility of the IT leadership team. It is obvious that the better a team understands its customers, the better they can support them. This can be done through basic awareness sessions and seminars, or attending more detailed sessions on the products and services that the organisation provides.

It is, however, crucial that this is not limited to an HR-run orientation programme when an employee joins the organisation. The IT leadership team will need to work closely with their counterparts in other parts of the business to set up seminars, lunchtime discussions, online training and whatever other mechanisms can be thought of.

For larger organisations it helps to have members of the desktop support team dedicated to one department or another, so as to provide the most appropriate service for that community.

Step 3: Get closer to the rest of IT
As has already been mentioned, there can often be a sense of superiority by other IT teams towards the service delivery team, and this can at best create tension and at worst resentment and open hostility. Not only does this harm the service delivery team's ability to do their job effectively, but it also weakens the other teams' ability to provide appropriate products and services to the business.

To resolve this, a couple of initiatives can be run:

• Involve the service delivery team in all projects at an early stage. Due to their day-to-day interaction with customers they can provide invaluable insight, and will certainly generate a lot of goodwill when it comes to rolling out the project.
• As with dealing with the business, get the infrastructure and development teams to carry out regular training sessions with the service delivery teams about the IT products and services that they provide. Not only will this give the service delivery teams better knowledge, but it will often allow those teams to give valuable feedback. In addition, tickets will often be routed more appropriately.
Step 4: Communication, communication,
communication
As with so many other aspects of IT and business, communication is key. Due to
how busy service delivery teams are,
especially desktop support team members,
it can often be difficult to understand what
is going on.
A solution I’ve seen is to have a daily
stand-up meeting. The purpose of the
meeting is to go through any major tickets
of the previous day so that all the team
understands what is going on.
In addition, better information can be provided to business customers. This
should include better information on how
to use systems, as well as information on
how to resolve common problems. This
can be in the form of static intranet pages,
hand-outs or videos showing the steps.
This kind of information can both reduce the number of tickets received by IT and improve the perception of IT by the business.
Step 5: Analytics
It is one thing to have a good relationship
with the business and to close tickets in
an expedited manner, but the real key to delivering high-quality service is being able first to carry out analytics on those tickets and then to act upon them.
addition, with well-created reports on the
team’s activities that can be presented to
the board, the demands of and the work
done by the service delivery team can be
easily understood by other senior members
of the organisation.
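Ticket analytics of this kind can start very simply. The following sketch counts recurring problem categories and where the team's time actually goes; the ticket fields, categories and numbers are invented for illustration, and in practice the data would come from the service desk tool's export or API.

```python
from collections import Counter

# Hypothetical ticket data exported from a service desk tool.
tickets = [
    {"id": 1, "category": "password reset", "minutes": 10},
    {"id": 2, "category": "printer", "minutes": 25},
    {"id": 3, "category": "password reset", "minutes": 8},
    {"id": 4, "category": "vpn", "minutes": 40},
    {"id": 5, "category": "password reset", "minutes": 12},
]

# Which problems recur most often? These are candidates for
# self-service guides, videos or automation.
by_category = Counter(t["category"] for t in tickets)
print(by_category.most_common(1))  # [('password reset', 3)]

# Where does the team's time actually go?
time_spent = Counter()
for t in tickets:
    time_spent[t["category"]] += t["minutes"]
print(time_spent.most_common(1))   # [('vpn', 40)]
```

Even this toy view shows the two questions can have different answers: the most frequent category is not the one consuming the most time.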
The last point is important not just in the
act of laying to rest any poor perceptions of
the team, but also to demonstrate why the
resources being asked for are necessary.
As can be seen from the five steps, it is
not enough to use the equation of ‘tickets
per month x half an hour’ to work out how
many staff are needed. It is also necessary
to factor in training sessions, document
creation and analytics.
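The staffing point can be made concrete with a toy calculation. All figures below are invented for illustration; the point is only that the naive 'tickets x half an hour' arithmetic understates headcount once training, document creation and analytics are budgeted for.

```python
import math

# Naive staffing estimate versus one that budgets for overheads.
TICKETS_PER_MONTH = 1200            # assumed ticket volume
HOURS_PER_TICKET = 0.5              # the 'half an hour' rule of thumb
OVERHEAD_HOURS = {                  # assumed monthly non-ticket work
    "training sessions": 40,
    "document creation": 30,
    "analytics and reporting": 25,
}
PRODUCTIVE_HOURS_PER_PERSON = 120   # assumed productive hours per month

naive_hours = TICKETS_PER_MONTH * HOURS_PER_TICKET
total_hours = naive_hours + sum(OVERHEAD_HOURS.values())

naive_staff = math.ceil(naive_hours / PRODUCTIVE_HOURS_PER_PERSON)
real_staff = math.ceil(total_hours / PRODUCTIVE_HOURS_PER_PERSON)

print(f"naive estimate: {naive_staff} staff")   # 600 hours -> 5 staff
print(f"with overheads: {real_staff} staff")    # 695 hours -> 6 staff
```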
With all of these points carried out, you are on the right track to having a high-performing and motivated service delivery team.
THE FUTURE OF
SERVICE MANAGEMENT
Service management has been maturing over the last decade. Processes are more streamlined and tools are available to support them. However, service management faces new challenges as technology extends further into business-facing and complex environments. Shirley Lacey MBCS, Managing Director of ConnectSphere, reports.
Future IT-enabled services and the move to shared services will affect our strategy for service management.
The move to flexible working, further
consumerisation of IT and the adoption
of new technologies like smart devices,
BYOD, cloud and mobile, analytics, big data
and the disruptive internet of things will
affect the business landscape and drive
transformation.
Gartner forecasts that 4.9 billion
connected things will be in use in 2015, up
30 per cent from 2014, and will reach 25
billion by 2020...
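As a quick sanity check on those figures, the implied growth rates can be computed directly. The numbers are Gartner's as quoted above; the arithmetic is plain compound growth.

```python
# Implied 2014 base and compound annual growth rate (CAGR) from the
# figures quoted above: 4.9bn connected things in 2015 (up 30 per cent
# on 2014) and 25bn forecast by 2020.
devices_2015 = 4.9e9
devices_2020 = 25e9

base_2014 = devices_2015 / 1.30                      # roughly 3.8bn in 2014
cagr = (devices_2020 / devices_2015) ** (1 / 5) - 1  # 2015 -> 2020

print(f"implied 2014 base: {base_2014 / 1e9:.1f}bn")
print(f"implied CAGR 2015-2020: {cagr:.0%}")         # roughly 39% a year
```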
Gartner’s top 10 technology trends that will
be strategic for organisations in 2015 are:
1. Computing everywhere
2. The internet of things (IoT)
3. 3D printing
4. Advanced, pervasive and invisible
analytics
5. Context-rich systems
6. Smart machines
7. Cloud/client computing
8. Software-defined applications and
infrastructure
9. Web-scale IT
10. Risk-based security and self-protection
Strategic development of services
Both business units and IT departments
are investing in digital, smart devices
and sharing information across the value
chain. They need to engage and work
together to ensure maximum return on
investments - working in silos will hinder
progress. They also need to decide what
role each party should play.
To develop new customer opportunities,
the business units, enterprise architects,
IT and service management professionals
and information security need to work
together from the concept stage. To
be ready for significant changes to the
current operating model, service providers
should continue to build their capabilities
in service management, especially service
strategy best practices.
Governance
With ecosystems that allow billions of
devices to connect, communicate and
interact, increased collaboration and
governance across value chains will be
necessary. Standardisation and secure
methods for ensuring interoperability and
information security will also be required.
As organisations transform their
business models and technology, it is
vital that business owners and heads of
technology delivery units lead and work
to the same goals. Departmental key
performance indicators (KPIs) should be
set to ensure buy-in from all areas.
Service integration and management
Service integration and management enables an organisation to manage its service providers in a consistent and efficient way across a portfolio of multi-sourced services and goods.
Service integration models, such
as SIAM (service integration and
management), are evolving from
managing a small number of large
suppliers to managing a greater number
of smaller suppliers. The degree of
service integration varies depending
on the complexity of the services and/
or customers being supported. For the
model to be effective the service portfolio
including component services needs to be
well-defined and understood.
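One way to make that definition concrete is to record, for each end-to-end service, its component services and the supplier responsible for each. A minimal sketch, in which every service and supplier name is invented:

```python
# A toy service portfolio: each end-to-end service is broken into
# component services, each mapped to the supplier responsible for it.
portfolio = {
    "online ordering": {
        "web front end": "Supplier A",
        "payment gateway": "Supplier B",
        "hosting": "Supplier C",
    },
    "payroll": {
        "payroll application": "Supplier D",
        "hosting": "Supplier C",
    },
}

def suppliers_for(service: str) -> set[str]:
    """All suppliers the service integrator must coordinate for one service."""
    return set(portfolio[service].values())

def services_affected_by(supplier: str) -> list[str]:
    """Which end-to-end services depend on a given supplier?"""
    return [s for s, parts in portfolio.items() if supplier in parts.values()]

print(len(suppliers_for("online ordering")))  # 3 suppliers to coordinate
print(services_affected_by("Supplier C"))     # ['online ordering', 'payroll']
```

Even this toy structure answers the two questions a service integrator is asked most often: who must be coordinated to deliver a service, and which services a failing supplier puts at risk.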
Changing roles in the new world
To succeed in the future, managers must be more agile, responsive and able to adapt to the needs of a radically different workplace.

As transformation happens, people regularly change their roles. Organisations need to identify the roles and responsibilities, knowledge and skills needed for each role, and ensure that there is continuity when the role changes or transfers.

Service management professionals will need to develop their skills. For example, they need to learn how to focus on delivering business outcomes, superior customer experience and service quality.

Enterprise architects, business analysts, user experience specialists, project teams, service designers, security and other service management professionals will need to work together to model, design, integrate, deliver and improve complex environments and end-to-end services.

CIOs and service managers will need to take on different and multiple roles to coordinate ICT activities. These might include collaborating and aligning with marketing and business executives, or being a cloud service broker negotiating relationships between cloud service customers and cloud service providers.

Globalisation and standards adoption
Adopting international standards enables globalisation, trade and service integration across the supply chain. Organisations have been adopting the following ISO standards:

• ISO/IEC 38500 Governance of IT;
• ISO/IEC 20000 for service management;
• ISO/IEC 27000 for information security.

ISO/IEC 20000 is based on practical industry experience to manage and deliver services for the business and customers.

Failing to standardise will make cloud services more complex to manage. Fortunately, ISO has just published standards for cloud computing:

• ISO/IEC 17788:2014 is an overview of cloud computing, terms and definitions.
• ISO/IEC 17789:2014 provides a cloud computing reference architecture (CCRA).
• ISO/IEC 20000-9 promotes guidance on the application of ISO/IEC 20000-1 to cloud services.

Future of the service management standard
In 2014, ISO established SC40, a new standards committee responsible for standards and guidance on IT service management and IT governance. At a recent meeting, national body representatives considered the future of the ISO/IEC 20000 series on service management.

Key strengths were identified as:
• many organisations have gained huge benefits by using ISO/IEC 20000;
• clear support for the end-to-end supply chain;
• continual service improvement is embedded into the ways of working;
• a technology-independent standard;
• implementation enables service providers to make rapid changes safely;
• alignment with ITIL1 service management (which provides 'how to' guidance).

An initial list of priority areas for the revision of the ISO/IEC 20000 series was identified as follows:

1. Governance of services
2. Risk management
3. Service strategy management
4. Requirements management
5. Design and transition of new or changed services – aligned with best practices
6. Customer perspective and user experience
7. Measurement and reporting of service levels, availability, business and customer value delivery
8. Supplier management – mapping and control of the supply chain
9. Knowledge management
10. More focus on developing people and their competencies

Managing all the new IT-enabled services with new operating models and billions of devices in new ecosystems will put new requirements and challenges on service management. Service management needs to aim to:

• work collaboratively in a smarter IT-enabled world to meet business and commercial needs;
• govern services across multiple suppliers to avoid any ambiguity about the boundaries of both responsibilities and accountabilities;
• design and measure IT-enabled services within the larger context of customer and organisational outcomes;
• start building and improving capabilities in service strategy, service design and transition;
• create a vision and strategy for service management for the new world, including automation.

Reference
1 ITIL is a registered trade mark of AXELOS Ltd.
SOFTWARE DEVELOPMENT
TRUSTWORTHY
SOFTWARE
Alastair Revell MBCS, Director
General of the Institution of
Analysts and Programmers,
takes a long, hard look at the
state of software today and
suggests that early adopters of
PAS754 might have the upper
hand in years to come.
It is undeniable that software now underpins
many aspects of national life. In many
respects, we have sleepwalked into a
situation where we, as a society, are now
heavily dependent on software.
We take for granted that it keeps
airliners in the air, facilitates the stock
exchange and money markets, helps
clinicians provide better care and keeps
the wheels of ecommerce spinning.
It also has crept rapidly into our private
lives, running our phones and TVs. It
collates our music and keeps us all
connected with social media. It generally
works and when it doesn’t, we shrug our
shoulders and say: ‘that’s technology for
you’, but is this a sustainable situation? It’s
an important question for those of us in
positions of leadership within IT.
There is no doubt that our reliance on
software is only set to continue and this
is surely good for those of us developing
software. However, we need to understand
that with all this good comes considerable
responsibility.
In particular, we need to be aware of
an issue that is slowly creeping onto the
radar of those whose lives depend on
technology. It’s the question of whether the
software is actually trustworthy.
A significant event happened on 10
June 2014. Publicly Available Specification
(PAS) 754: Software Trustworthiness
– Governance and Management
Specification, which was facilitated by the British Standards Institution, was launched.
the Trustworthy Software Initiative (TSI),
which is a joint venture between HM
Government’s Department for Business, Innovation and Skills (BIS) and the
Centre for the Protection of the National
Infrastructure (CPNI).
The importance of this new PAS to the
UK Government was underscored by it
being formally launched in London by the
then Minister of State for Universities and
Science, The Rt Hon David Willetts MP,
before an invited audience drawn from
industry and academia.
Consumers, and quite clearly HM
Government, are beginning to realise just
how dependent they are becoming on
technology. This trend will only continue
and deepen as we hear of more and more
IT fiascos. It worries people and unnerves
them. People are beginning to shift from accepting that software has its flaws to increasing anger that there should be any at all. Is the writing on the wall for untrustworthy software development?

Part of the problem is the over-optimistic selling of solutions that were never going to be delivered on time or to budget. Deals are closed with naïve buyers who are promised everything and anything they want. The unrealistic budget is consumed early in the project cycle and, when the product comes close to release, it generally works, so why fuss over a few cut corners and buffer overrun issues?

People, it would appear, can tolerate their TV system occasionally crashing or their phone freezing on them because of these shoddy practices. However, deny them access to their money and they become far less tolerant. Availability is an increasingly significant issue.

Reliability is another aspect that is climbing up the agenda. The ability of poor-quality software to disrupt not just our own lives but the whole of society will increasingly drive people to demand better.

It won’t be long before computers are embedded into cars that are just another device on the internet of things, along with power meters and a host of other sensors, all open to hacking and exploitation. When computer failures start to impact on lives on a regular basis, then trustworthy software will be a prerequisite. Indeed, safety concerns will become very important. Whenever I speak to IT professionals and the topic turns to regulation, everyone seems agreed that it will be our ‘first bridge collapse’ that will force the issue.
HM Government, charged with our welfare and the security of the realm, is deeply concerned with resilience. Just how
resilient is our national infrastructure?
Perhaps not as resilient as it should be
since those procuring software for the
state have repeatedly got it wrong as far
as large-scale IT systems go. It’s not just
government – bank systems don’t seem
too resilient either if recent issues are
anything to go by.
Security is increasingly high on the agenda,
even amongst celebrities who have found the
systems they once trusted were not so secure.
Most organisations are far from focused on
security. It’s burdensome. Consumer attitudes,
though, are beginning to change – sharply.
People are beginning to realise that all the
free services they depend on so heavily
have often cut corners. As consumers begin
to understand and (sadly) experience how
vulnerable they are to insecure systems and cyber-attack, they will soon be asking pointed questions about how trustworthy the software
they rely on actually is.
The PAS754 specification focuses
those involved in commissioning and
delivering software on the five facets of
trustworthy software: safety, reliability,
availability, resilience and security. Each of
these facets is likely to become a driver in
software purchasing in the near future.
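The five facets lend themselves to a simple assessment checklist. The sketch below is a toy illustration only: the 1-5 scoring scale and the threshold are my own invention, not part of PAS 754, which names the facets but does not prescribe this scoring.

```python
# The five facets of trustworthy software named in PAS 754.
FACETS = ("safety", "reliability", "availability", "resilience", "security")

def weakest_facets(scores: dict[str, int], threshold: int = 3) -> list[str]:
    """Return the facets scoring below the threshold on an assumed 1-5 scale."""
    missing = [f for f in FACETS if f not in scores]
    if missing:
        raise ValueError(f"unscored facets: {missing}")
    return [f for f in FACETS if scores[f] < threshold]

# Hypothetical self-assessment for one system.
scores = {"safety": 4, "reliability": 3, "availability": 2,
          "resilience": 3, "security": 1}
print(weakest_facets(scores))  # ['availability', 'security']
```

Forcing every facet to be scored, rather than silently skipping the awkward ones, is the point of the `ValueError`: an unexamined facet is not a passing one.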
I would urge software developers to look
at this PAS. If you accept that technology
is becoming more pervasive and society’s
dependence on software more critical,
then you will clearly realise that those who
are early adopters in this space will be the
long-term winners.
If you are already doing the noble thing
then now is your opportunity to stand out
from the crowd. The writing is on the wall.
The question is, can you read it yet?
CLOUD-BASED
SOFTWARE SYSTEMS
The ‘Cloud’ changes everything for organisations building and operating their own internet-connected
software systems. However, many organisations are unaware of the scale and nature of the changes in
technology and teams needed to take advantage of the cloud in order to remain current and competitive.
Matthew Skelton MBCS, Co-founder and Principal Consultant, Skelton Thatcher Consulting, reports.
Elastic, programmable, on-demand infrastructure was pioneered in the late 2000s by Amazon with Amazon Web Services (AWS) and has since been taken up by numerous additional cloud providers, notably Microsoft Azure, but also Rackspace, Digital Ocean, virtualisation innovators VMware and many more.
The lead time for IT infrastructure
changes in the cloud can be mere minutes
or seconds compared to lead times of
days, weeks or months for traditional
on-premise or datacentre-based
infrastructure.
This step change in the time taken for
infrastructure changes – along with the
greater control, visibility and arguably
security afforded by programmable
‘infrastructure as code’ – has huge
ramifications for organisations building
and operating differentiated software
systems and services.
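'Infrastructure as code' means the desired estate is declared as version-controlled data, and a tool computes the actions needed to converge reality on that declaration. Real tools such as Terraform or AWS CloudFormation do this at scale; the following provider-neutral toy, with invented resource names, shows only the core plan-then-apply idea:

```python
# Desired state: what we want running. Current state: what exists now.
desired = {"web-1": {"size": "small"}, "web-2": {"size": "small"},
           "db-1": {"size": "large"}}
current = {"web-1": {"size": "small"}, "db-1": {"size": "medium"}}

def plan(desired: dict, current: dict) -> list[tuple[str, str]]:
    """Compute the actions needed to converge current state on desired state."""
    actions = []
    for name, spec in desired.items():
        if name not in current:
            actions.append(("create", name))
        elif current[name] != spec:
            actions.append(("modify", name))
    for name in current:
        if name not in desired:
            actions.append(("destroy", name))
    return actions

print(plan(desired, current))
# [('create', 'web-2'), ('modify', 'db-1')]
```

Because the plan is computed rather than typed, the same change can be reviewed, repeated and rolled back; this is what collapses lead times from weeks to minutes.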
Faster delivery comes at a price worth paying
The pre-cloud model of software delivery assumes that the order and speed of new features have little impact on the longer-term viability of the software system.
This ‘fire and forget’ approach is not
suitable for ever-evolving cloud-based
software systems. Instead, the continuous
innovation practiced by an increasing
number of organisations requires ‘delivery’
to include all teams in the value stream,
including IT operations and service desk staff.
Techniques and practices such as
continuous delivery and DevOps work
well precisely because ‘hand-over’
between teams is avoided in favour of
in-team responsibility for both innovation
and improvements based on feedback
from the software in operation. Such approaches may not be cheaper in terms of head count, training or salaries, but the resulting approach – seeing IT as ‘value-add’ – can be much more effective than one that sees IT purely as a cost centre, as recent studies by Gartner, Forrester and others have shown.
Lean techniques have shown that by
limiting the work in progress (WIP) through
teams, organisations can achieve more
in a given length of time, but this requires
discipline from the commercial or product
teams and a clear strategy for product
evolution. Cross-programme prioritisation
with limited WIP is well-attuned to cloud
software.
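The WIP-limiting discipline can be pictured as a kanban-style board that refuses to start new work while the team is at capacity. A minimal sketch, with an invented limit and invented feature names:

```python
# A toy kanban-style board that enforces a work-in-progress (WIP) limit:
# work is pulled into 'in_progress' only when capacity allows.
class Board:
    def __init__(self, wip_limit: int):
        self.wip_limit = wip_limit
        self.backlog: list[str] = []
        self.in_progress: list[str] = []
        self.done: list[str] = []

    def add(self, item: str) -> None:
        self.backlog.append(item)

    def pull(self) -> bool:
        """Start the next item only if we are under the WIP limit."""
        if self.backlog and len(self.in_progress) < self.wip_limit:
            self.in_progress.append(self.backlog.pop(0))
            return True
        return False

    def finish(self, item: str) -> None:
        self.in_progress.remove(item)
        self.done.append(item)

board = Board(wip_limit=2)
for feature in ["search", "checkout", "reporting"]:
    board.add(feature)

board.pull(); board.pull()
print(board.pull())        # False: WIP limit reached, 'reporting' must wait
board.finish("search")
print(board.pull())        # True: capacity freed, 'reporting' starts
```

The refusal in the middle is the discipline the article describes: new work starts only when something finishes, which is what forces prioritisation by the commercial or product teams.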
Operability should be a first-class concern
The cloud requires us to continue to care
well beyond the initial deployment of a
system into production.
This means that project-based funding models that prioritise user-visible features over operational concerns will tend to harm the effective evolution of cloud-based software systems.
Both user-visible and operational
features (often labelled unhelpfully as
‘non-functional requirements’) must be
prioritised from the same backlog and by
the same product owner, who must also
be measured on both the user-visible
features and the operational success of
the system.
In parallel, people in an operational role
must be considered as stakeholders with
specific needs and treated as another
customer by the software development team.
Two key operational criteria essential
for cloud software are deployability
and monitorability. The capability to
deploy new software in a rapid, reliable,
repeatable and recurring manner is
absolutely fundamental to a successful
cloud-based software system. Allowing
deployability to shape the design of
software is no more or less strange
than allowing deployability to shape
the design of temporary bridges in a
battlefield situation: the changing temporal
requirements drive the architecture.
Likewise, the need for monitoring is
paramount in cloud-based systems, partly
because we simply cannot predict all
behaviours when dealing with an open-ended complex distributed system (as cloud-based systems increasingly are).
Cloud software needs dedicated
monitoring features such as diagnostic
endpoints along with auxiliary tooling such
as metrics collection: these cannot be
retrofitted as an afterthought, but must
inform the system design.
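A diagnostic endpoint can be as simple as an HTTP route through which the service reports its own view of its health, designed in from the start rather than retrofitted. A standard-library-only sketch; the `/healthz` route name and the checks it reports are invented for illustration, and the hard-coded report stands in for live checks:

```python
import json
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

def health_report() -> dict:
    # In a real system these would be live checks (database reachable,
    # queue depth, deployed version, etc.); here they are hard-coded.
    return {"status": "ok", "version": "1.2.3", "checks": {"db": "ok"}}

class DiagnosticHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/healthz":
            body = json.dumps(health_report()).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_response(404)
            self.end_headers()

    def log_message(self, *args):  # keep the demo quiet
        pass

# Bind to port 0 so the OS picks a free port, then poll the endpoint
# exactly as a monitoring tool or load balancer would.
server = HTTPServer(("127.0.0.1", 0), DiagnosticHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

url = f"http://127.0.0.1:{server.server_port}/healthz"
with urllib.request.urlopen(url) as resp:
    report = json.loads(resp.read())
print(report["status"])  # ok
server.shutdown()
```

The design point is that the endpoint is part of the system's features, shaping its architecture, not a bolt-on added once the system is already in production.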
A focus on operability as a first-class
concern has benefits not only for the
operational success of the cloud systems
but also the collaboration between
software development teams and IT
operations teams, who during the past
15-20 years have grown increasingly apart
in culture and focus.
By using technical tooling in innovative
settings – such as log aggregation tools
for software debugging in development
as well as for production diagnostics in
operations – and techniques such as Run
Book collaboration, we can nurture shared
practices between development and
operations for more effective and reliable
systems.
Figure 1. Changes to technology and teams needed for cloud systems: isolated activities give way to a focus on delivery, teams in silos to close collaboration, and monolithic designs to lightweight components. (Copyright © 2014 Skelton Thatcher Consulting Ltd)
Team structures dictate systems
architecture
One of the most astonishing realisations about software systems – first articulated by Mel Conway in 1968 – is that an organisation building software is constrained to produce designs that reflect the communication structures of the teams within the organisation: Conway’s Law. This has huge ramifications for organisations building differentiating software systems. It suggests that large, up-front designs by software architects are doomed to failure unless the designs align with the way in which the teams communicate.

In turn, it means that by restructuring teams and facilitating (or potentially deliberately limiting) communication between teams, we have a much better chance of building systems that work well in production and feel natural to evolve over time.

That is, if we know we need to be able to deploy different parts of the system independently with a short lead time, and we decide to use small, decoupled services in order to do so, we need to make our teams similarly small and decoupled, with clear boundaries of responsibility.
Approaches to building and operating
cloud software such as microservices
accommodate the need for human-sized
chunks of functionality alongside enhanced
deployability. Teams have a greater chance
of innovating and supporting a system
if they can understand the constituent
parts and feel a sense of ownership over
the code, rather than being treated like a
worker on an assembly line.
In order to take advantage of the
many benefits of cloud technologies,
organisations that develop and operate
their own software systems must set up
their organisations radically differently
from how it was done in the past. Team
structures must match the required
software architecture or risk producing
unintended designs.
Software must be designed with
operability as a first-class concern, with
the ability to deploy and monitor the
systems being core to the software’s
features; otherwise the long-term viability
of the software will be endangered.
Finally, the way in which the development
and operation of the software systems
is funded must encourage collaboration
and treat IT as adding value rather than
incurring cost.
2015 DIGITAL LEADERS
121
SOFTWARE DEVELOPMENT
FASTER, GREENER PROGRAMMING
Compilers take computer programs and translate them to the binary machine code processors actually
run. Jeremy Bennett MBCS, Embecosm, BCS Open Source Specialist Group, takes a look at two recent
industrial research projects supported by Innovate UK, the government’s innovation agency, which
advance the state of the art with compilers such as GCC and LLVM.
MAchine Guided Energy Efficient
Compilation (MAGEEC)
A study carried out by James Pallister
at Bristol University and funded by the
UK compiler development company
Embecosm in summer 2012 found that
choice of compiler optimisation flags had
a major effect on the energy consumed by
the compiled program. The bad news was
that the options to be used varied from
architecture to architecture and program
to program [1].
The MAGEEC program (www.mageec.
org) was funded by the Technology
Strategy Board (now Innovate UK) under
its Energy Efficient Computing initiative,
to develop a machine-learning-based
compiler infrastructure capable of
optimising for energy.
Running from June 2013 to November
2014, it was a joint feasibility study
between Embecosm and Bristol University,
to develop a machine-learning compiler
infrastructure that could optimise
for energy. Key criteria were that the
infrastructure should be generic, that it
should optimise for energy, that it should
be based on real energy measurements,
not models, and that it should create a
fully working system. The entire project
was free and open source.
MAGEEC needs just two interfaces to a
compiler, which are typically provided by
the compiler plugin interface.
First it needs to know the
characteristics of the source code. This
sort of language analysis is exactly what
the front of the compiler does, so it uses it
to output this information. MAGEEC defines
the set of features we are interested in,
and the compiler plugin must provide this
information - information like the number
of variables in a function, the number of
basic blocks and so on. At present we have
55 features defined, and any program can
be reduced to a collection of values—one
for each of these. The machine-learning
system will train against this feature
vector.
The second interface we need is a way
to control which optimisation passes the
compiler runs. While at the command
level, these correspond to optimisation
flags, at the programming level the
compiler has a pass manager, which
decides which pass should run. MAGEEC
uses a plugin interface to the pass
manager to override which pass should
run.

[Figure: the MAGEEC architecture – a compiler plugin containing a feature extractor and a pass gate, feeding the MAGEEC machine learner.]

Key criteria were that the infrastructure should
be generic, that it should optimise for energy,
that it should be based on real measurements.

MAGEEC operates in two modes. In
training mode, it collects data that will
be used to drive the machine-learning
system. A very large amount of data is
needed from a large and representative
set of programs, which are then run
with a wide range of different compiler
optimisations to measure how the
optimisations affect energy. A key
challenge was the lack of free and open
source benchmarks suitable for embedded
systems, so we created the Bristol/
Embecosm Benchmark Suite (BEEBS),
which comprises 80 programs with
minimal system dependencies. It is freely
available from www.beebs.eu.
Selecting which optimisations to use
when building up the training data is a
huge challenge. GCC, for example, has over
200 optimisations, many of which affect
each other, so we can't just try each one
in isolation. Yet to run all 2^200 combinations
would take millions of years, so we turned to
experimental design theory. MAGEEC is
one of the first computer applications to
make use of fractional factorial design to
optimise the number of measurements
to run. This is combined with Plackett-Burman
design to identify the most
important optimisations to evaluate,
reducing the number of runs to a few
hundred, which can be completed in a few
hours.
The output of all these tests is a large
data set, from which we can construct
a database correlating source code
with the best optimisations to use for
energy efficiency. MAGEEC is deliberately
independent of any particular machine-learning
algorithm. By default MAGEEC
uses a C5.0 decision tree, but others
in our team have explored the use of
genetic algorithms and inductive logic
programming.
With the machine-learning database,
MAGEEC can be switched to its default
SOFTWARE DEVELOPMENT
mode. In this mode, when supplied with
a program, the pass manager plugin will
look into the machine-learning database
to determine which optimisation passes to
run.
We are in the early stages of evaluating
MAGEEC, but initial results using a small
training subset with the Cortex M0
processor have shown we can get a 3–7
per cent improvement in energy usage
over the best possible today. However, with
a fully trained dataset our objective is a 40
per cent reduction in energy usage.
There are a number of side benefits
that have arisen out of this project.
Energy is reasonably well correlated with
performance, so energy-efficient programs
typically run much faster than can be
achieved with standard optimisations.
A second effect is that the pass manager
is typically controlling a compilation of
individual functions, each of which gets its
own optimisation settings. This
fine-grained optimisation adds a further
improvement to the code generated.
Superoptimisation
We are all used to compilers ‘optimising’
code. However, the code is not truly
optimised, in the sense that it is the best
possible sequence of instructions to
perform the desired functionality.
Superoptimisation works by searching all
possible sequences of instructions to find
the best possible sequence.
The following example shows how
a simple C sign function is compiled
by a standard compiler and then by a
superoptimiser.

C code:

    int sign(int n)
    {
      if (n > 0)
        return 1;
      else if (n < 0)
        return -1;
      else
        return 0;
    }

Standard compiler:

      cmpl  d0,0
      ble   L1
      movel d1,1
      bra   L3
    L1:
      bge   L2
      movel d1,-1
      bra   L3
    L2:
      movel d1,0
    L3:

Superoptimiser:

      addl  d0,d0
      subxl d1,d1
      negxl d0
      addxl d1,d1
In each case the value is supplied in
register d0 and returned in register d1. The
superoptimiser code is not at all obvious
although three of the instructions use the
carry (x) bit. As an exercise look at the
values in d0, d1 and x when the input value
is a negative number, a positive number
and zero.
This is typical of superoptimisers:
finding code sequences that no human
coder is likely to see. The technique is old,
introduced by Henry Massalin in 1987 [2], but
has always suffered from the immense
computational requirements of the huge
search space, and so has remained an
academic speciality.

Superoptimisation works by searching all
possible sequences of instructions to find the
best possible sequence.

Much research in the intervening 27
years has gone into improving search
efficiency. During the summer of 2014,
James Pallister of Embecosm and Bristol
University spent four months funded by
Innovate UK investigating whether any
of these techniques are now feasible for
commercial deployment (details at www.
superoptimization.org).
The good news is that the advances
in compute power, combined with
improvements in the search algorithms,
mean some types of superoptimisation
can now be considered for commercial
deployment. One of the oldest
superoptimisation tools is the free and
open source GNU Superoptimiser (GSO).
This tool can typically optimise
sequences up to around seven instructions
on a modern PC. This may not sound
much, but if that code sequence is the key
inner loop at the heart of a major piece
of software, the returns are more than
justified.
More generally, GSO can be used as an
amanuensis for the compiler and library
writer, identifying architecture-specific
peephole optimisations and efficient
implementations of key library functions.
The study also found that some
techniques will be commercially feasible
with a relatively modest investment in
further industrial research. Perhaps
most exciting is the use of Monte Carlo
methods to explore the search space
in the technique known as stochastic
superoptimisation. This can find much
longer code sequences, and remarkably
those code sequences may represent
completely new algorithms - in other
words code sequences that could never
be found by a conventional compiler’s
stepwise refinement of code. In a paper on
the subject [3], a team from Stanford used
this approach to discover a completely
new implementation of the Montgomery
multiplication benchmark.
Embecosm is now working with James
Pallister on a new version of the GNU
Superoptimiser that will add stochastic
superoptimisation.
Some of the leading work on compiler
optimisation is being carried out in the UK,
with contributions from both industry and
academia. The output from this work is all
open source, meaning the technology can
be quickly and widely adopted.
References
1. James Pallister, Simon J. Hollis and Jeremy Bennett, Identifying Compiler Options to Minimize Energy Consumption for Embedded Platforms. The Computer Journal, published online 11 Nov 2013, doi:10.1093/comjnl/bxt129.
2. Henry Massalin, Superoptimizer: A Look at the Smallest Program. Second International Conference on Architectural Support for Programming Languages and Operating Systems (ASPLOS II), pages 122-126, 1987.
3. Eric Schkufza, Rahul Sharma and Alex Aiken, Stochastic Superoptimization. ASPLOS '13, March 16-20, 2013, Houston, Texas, USA.
STRATEGY
EFFICIENT SUPPLY CHAIN
Robert Limbrey MBCS explains how driving an efficient, resilient IT supply chain can improve an
organisation's standing on the global stage.

We live in an application economy. As the
world becomes increasingly connected,
leading corporations are using a variety of
web and mobile applications to reach their
customers, promote their brand and
conduct their business. The digital channel
has become an important differentiator –
companies that connect better with their
customers perform better.
In this application economy, the cadence
of change is high as agile businesses
respond to customer sentiment. Hence,
IT executives find themselves sandwiched
between these powerful and unstable
business forces and a more traditional,
less agile supply chain – a chain that
typically has many layers of redundancy
built in, in order to resiliently service these
demands.
At the same time, the consumerisation
of IT has empowered a new breed of
executive – one that is equally comfortable
commanding both IT and business
strategy.
It is these executives, unscarred by
failed implementations of complex
ecosystems, who demand faster, better,
stronger solutions. This shows how
quickly complexities can be dismissed
if they happen 'in the cloud', and how
much easier it would be if services were
purchased off-the-shelf in the manner of
insurance or telephone contracts.
This consumerisation has fuelled
a recent trend towards outsourced
technology services, which has led to
a new breed of 'cloud-based' suppliers
who are able to challenge cost-inefficient
legacy models with scale and flexibility. To
the executive these services are utopian
in allowing their business to scale, whilst
masking the complexity of maintaining and
delivering a resilient service.
Yet, can we ever outsource the
resilience problem completely?
As applications are released and
business patterns change, demands on
infrastructure – whether outsourced or not
– are entirely related to business cycles.
Demand volumes, an entirely business-contextual
metric, ultimately create a
need to invest in IT. For fully cloud-enabled
businesses, the cost base is merely a
proportion of their revenue figures; for
other businesses, matching investment
to current and future business
demand is a constant challenge. Whether
in-house or outsourced, IT has become
a supply chain, servicing the application
economy.
A supply chain has several
characterising features, and the IT
supply chain is no different. It can be
characterised as an integration of
services that forms a business-relevant
capability, broadly fitting the following
model:

• applications;
• infrastructure;
• data centres.
What used to be called ‘capacity
management’ has grown into something a
lot more strategic. Right-sizing the
supply chain combines an understanding
of demand, configuration and financials
to deliver a function that ensures service
right-sizing, headroom management and a
steely-eyed focus on cost-efficiency.
This new breed of IT management
practice brings an empirical
understanding of the technical domain,
together with a contextual understanding
of business demands, to the procurement
of IT services, whether externally sourced
or not.
It follows that, no matter what your role
in IT operations, being able to right-size
capacity and configuration is crucial to be
able to demonstrate value to the business.
Whether your domain is managing data
centres, IT infrastructure or dealing with
applications, if right-sizing is not a relevant
activity for you, then it certainly will be for
your competitors.
If you work for an outsourcer, your
ability to right-size will allow you to
compete more effectively in the market,
offer improved quality of service, have
better operational control over your
margins and be a better business partner
to your customer. And if you work for
in-house IT, guess what? The benefits are
exactly the same.
Three fundamentals exist in right-sizing
the IT supply chain:
1. You must gain a good understanding
of demand. Without this, you have
no understanding of the upstream
activity that is driving your business.
Contextualising this is key, so that
relevant metrics are identified and
used for monitoring and forecasting.
Whilst empirical data will provide a
baseline for forecasting, it’s imperative
to get a closer understanding of the
business in order to understand the
projections and projects that represent
growth or potential step-changes in
demand.
2. You must build exceptional insight of
configuration and how that constrains
and defines the available capacity.
When correlated against performance
monitoring data, the resulting insight
will help to ensure that you maintain
an optimised environment, and one
that can provide the resilience that
your customers need.
3. Everything must be contextualised
against the financial model – the
operational context that describes how
services are procured, under which
service levels and at what costs. The
right-sizing function must be
accessible to key stakeholders, providing
insight into the alternative investment
decisions they will be faced with and
empowering them to take decisions
about cost and resiliency.
Study any large retailer and you'll quickly
understand the value of supply chain
management. Consider Zara, who
minimise the amount of stock they hold in
warehouses – their model allows them to
keep overheads low and respond quickly to
changes in fashion.
This model has given them a huge
competitive advantage over their 'large
warehouse' competitors, whose overheads
protect them against long delivery cycles
and under-stocking. Zara is a lean-and-mean
operator; survival is completely
dependent on agility. If you can accurately
forecast demand and have an agile
supply chain, then you can outclass your
competitors in time-to-market, cost-efficiency
and quality of service.
Industrialisation of IT supply chain
management is equally important. During
the industrial revolution, the economist
Adam Smith observed of pin manufacture
that production could be significantly
improved if individual tasks were assigned
to specialists and brought together into an
integrated product at the end of the process.
We can now recognise that the IT supply
chain is a pin factory: jobs such as
hosting can be delivered more effectively
if done by specialists, and then integrated
to deliver applications that perform at
scale. The cloud marketplace – where
these services are delivered with agility
– generates healthy competition between
internal and external specialist suppliers.
2015 will be an incredible year of
opportunity for the application economy.
Just don't ignore the role an efficient,
resilient IT supply chain plays in making
it happen.

References:
www.businessweek.com/articles/2013-11-14/2014-outlook-zaras-fashion-supply-chain-edge
Pathways to Industrialization and Regional Development, 2005, Routledge
STRATEGY
COMMUNICATING GLOBALLY
The globalisation of companies and social
networks means that small and medium-sized
companies, as well as big corporates, can
take advantage of global services. This makes
the skills and strategies for effective global
communication an absolute necessity. With this
in mind, George Budd MBCS provides some
easy-to-implement strategies to help individuals
and teams become effective in a rapidly changing
global environment.
Clear verbal communication
First and foremost, clear verbal
communications are vital. The communication
medium is not as important as ensuring
the message is understood by the listeners.
It would appear that colleagues in Asia and
South America often develop their English
skills based on US television and films.
The result is that those of us who
have learnt English in the UK are at an
immediate disadvantage. However, there
are two main strategies that can help
enhance communication:
The first is the simplest, and that is to
speak as slowly as possible and ask the
listener to confirm understanding of key
points. The second is to seek specialist
training to improve verbal communications
with those from different backgrounds.
Time zones
We are now speaking clearly and everyone
in the world can understand our verbal
message. So we set up a regular global
team meeting at 1pm GMT so that the US
team members are available.
But for our team members in Hong
Kong the meeting is at 9pm HKT. With the
best will in the world we cannot expect
our colleagues in Hong Kong to be at peak
performance at 9pm HKT after having
been on the go since probably around 7am
HKT.
So how can we avoid this?
The main strategy is to consider carefully
the necessity of such global meetings and
keep them to a minimum.
For every meeting be mindful of the
participants’ location and make an effort to
ensure it is at a reasonable and convenient
time. Another useful strategy is to start
work early in the UK on a weekly basis
and use this time to hold meetings with
colleagues in Asia, who will be at their best
and will appreciate the effort you have taken.
Email overuse and instant messaging
Email is a useful communication medium
and has been a trailblazer. However, it
now seems to be treated as actual
work in itself. There appears to be an
implicit assumption that individuals have
to be seen to respond to mails quickly.
Team members are aware they have no
control over who views the mail so it has
to be worded very carefully, and this often
results in hours spent writing one email.
The cycle continues; everyone that
receives that mail thinks they have to be
seen to be replying and so it goes on and
on. An additional factor is that, a large
percentage of the time, the message in
the mail isn't sufficient, leading to a
mail chain that wastes time before
anything is understood.
All this time spent checking the mail
system could be much better utilised. In
fact there is also no guarantee that the
message is understood or agreed with at
all.
Therefore always consider when to
use email and when not to and carefully
think about whether you really need to ‘cc’
at all. I have found that rather than firing
off a mail, a quick video call or real-time
message is far more effective and usually
the subject matter can be agreed on the
spot.
So adopt the strategy and use instant
messaging for getting a quick response
to your enquiry or to ensure someone
has received the message. Most instant
messaging systems enable you to see if
the recipient is online.
Video conferencing
I have found the most effective medium
for carrying out complex global
communications is video conferencing.
Traditional teams that are based in a
single location have the advantage of
developing good interpersonal
communication.
This helps understanding amongst
the individuals, which supports the team
development. If we are speaking to
someone in the traditional office setting,
we are able to read nonverbal cues and
see how the message is being received.
However, in global teams, using only
telephone conferences, the majority of
this is lost. Quite often on global voice
conference calls there are the proverbial
tumble weed moments where you don’t
know if anyone has heard your message or
even if they are still there.
Using video conferencing for global
meetings means that you can see who is
on the meeting and whether they are being
attentive or distracted. It is also possible
to see the body language of participants in
response to your message.
That said, it is important to highlight that
video conferencing can be a double-edged
sword. As well as being able to see the
participants they can also see you! So if
dealing with a challenging team member,
everyone can observe your responses as
well.
A critical point here is that video
conferencing needs to be utilised in the
same way as if it was a direct face-to-face
meeting. Remember that not only body
language, but your attire and surroundings
are also visible.
A little preparation and prior planning
will stand you in good stead on a video
conference. Ensure you are dressed
appropriately to support your professional
image.
Check the image and focus of your
camera prior to the meeting and make
sure your head and shoulders are clearly
visible. It is really distracting talking to
the top of somebody’s head. Check that
the background is appropriate and not
distracting.
Being on a video conference is not unlike
being a news presenter. To improve your
performance on video watch how news
presenters operate and communicate and
how they look at the camera the majority
of the time so it looks like they are engaged
at a personal level.
It is difficult to gauge everybody’s
responses but looking at the camera rather
than the screen for key messages is a
good start. At all times present an attentive
demeanour and emphasise that you are
listening and attending. Remember to
encourage and affirm the dialogue through
the use of minimal encouragers, i.e. nodding,
saying 'OK', etc. It is difficult on a video conference
to keep interrupting with appreciative
statements, which increases the
importance and effectiveness of positive
body language. There are a number
of books available on TV presentation
skills that can provide further hints and
guidance.
Further to the earlier comments about
practising verbal communication with
global team members, having regular one-to-one
video conferences with key team
members is a great strategy. Furthermore,
it is useful to treat these slots as a
completely or partially informal chat.
This can help you get to know your team
and colleagues better. To build rapport, it
is useful to put an item of interest in the
background to prompt discussion and
initial engagement. It is impossible to get
to know everyone personally but taking
the effort for key personnel will deliver
much benefit. Encouraging others to do
this distributes the benefit of improving
communication and understanding
globally.
Always consider how you are being
heard by someone whose first language
is not English, and at a minimum, slow
down to aid understanding. Practise verbal
communication with colleagues in other
countries and ask them what they find
hard to understand.
Make the effort to have meetings at a
reasonable time for the location you are
communicating with. It is always well
received if you start early on occasions to talk
to someone during their normal office hours.
Avoid email as it leads to wasted
time and the message can easily be
misunderstood or lost. Use instant
messaging for quick responses and finally
use video conferencing wherever possible,
but be aware of your body language, attire
and surroundings.
STRATEGY
CIO DRIVING SEAT
2014 was another challenging year for the CIO, with plenty of column inches given over to debating
the control and usage of technology across the enterprise, and much speculation about the validity of
the role itself. What is critical right now is that CIOs evolve in 2015 and show their true worth
in helping to set the strategic direction of their organisations. Christian McMahon FBCS explains why he
thinks CIOs now need to be in the digital driving seat rather than being a passenger.
Those CIOs who are more commercially
astute and who have a capacity to add
tangible value to their business will excel,
but those who are not will most likely be
sitting in a different chair at the start of
2016.
As a result of the recent economic
turmoil and rapidity of change across
the commercial landscape, many
organisations are now looking for a
different type of CIO or technology leader
than they have had in the past. They are
shifting from needing a more technically
focused individual to one who is able to
unravel the complexity of IT, increase the
accessibility to technology and be open
to new ideas with the ability to work with
peers on getting the right things done.
One of the key factors in this
evolutionary change in the CIO role is the
need to understand and appreciate that
they no longer have ultimate say over
what technologies are used within their
organisation, but they will still be held
accountable for making sure it all works.
Gartner research has shown that 38 per
cent of IT spend is already outside of IT
and this is expected to reach 50 per cent
by 2017. This is going to send a shiver
down the spine of many a CIO, but it shows
the need to understand the diversification
of technology usage and requirements
across their organisation. This is quite the
culture shift for many who have migrated
into the CIO role from a traditional ‘lights
on’ IT director role of old.
For too long CIOs have been identified
as the strategic and commercial weak
link in the c-suite rather than as someone
who adds tangible value across the
business. CIOs must seize this opportunity
to transform their role and reputation into
one that thinks collectively, understands
how best to resolve the issues that matter
across the business and ultimately
delivers commercial value.
The main focus for many of us this year
is on how to transform into and drive a
digital business. This is exactly where the
CIO can step up and work with peers and
key stakeholders across the business to
define a strategy that is moulded around
a ‘customer first’ approach where digital
technologies will form the cornerstones of
how services are delivered and consumed
going forward.
This will require much managing of
change, process and incumbent technology
and possibly require a marked change in
strategic direction – a role tailor-made for
the commercially astute CIO in harness
with the CMO.
The impact of digital business on
industries and individual organisations
cannot be underestimated and Gartner
have predicted that by 2017 one in five
industry leaders will have ceded their
market dominance to a company founded
after 2000.
This is a bold claim, but one that I
support as you can no longer rely on
historical dominance of your sector – either
embrace disruption now or start planning
your burial in the corporate graveyard
alongside luminaries such as Kodak and
Blockbuster.
CIOs must embrace a bi-modal IT mindset where they simultaneously embark
on the digital transformation journey
whilst maintaining business as usual
(BAU) services. It’s no secret that the most
successful CIOs are those who are able to
run the business and transform it at the
same time. Many industry observers and
consultants will tell you that they have
witnessed more transformation in the last
three years than in the previous 20 years
combined, so this shows how important
these skills are in the modern CIO. I don’t
see any lessening in this pace as the
demand for new and simpler ways to
consume data, information, products and
solutions is only going to increase year on
year as the technology and accessibility to
it improves.
CIOs will also need to start concentrating
on what talent to bring into their
organisations this year to manage this
‘bi-modal IT’ approach as the market for
the best talent is already stretched and
growing ever more taut.
CIOs should help their business
colleagues and the CEO to think outside
the box to imagine new scenarios for
digital business across companies and
industries; this will provide a great
opportunity for CIOs to amplify their role in
the organisation.
CIOs need to help create the right mindset and a shared understanding among
key decision makers in the enterprise – to
help them ‘get’ the possibilities of digital
business. They must take a leadership role
in helping their organisations change their
mind-set to what’s possible – and what’s
inevitable in a digital business future.
It’s going to be a busy year with a fair
amount of turbulence, so buckle up and
enjoy the ride.
STRATEGY
Image: iStock/477330603
opportunities to improve the way IT is
delivered?
It’s easy for an IT strategy to be over-focussed
on the new technologies and the projects
that IT will be delivering for the business.
It is also important that IT takes a look
at itself and assesses whether there are
opportunities to improve how it operates
and where service quality can be improved
or costs reduced.
This may include considering areas
such as moving services into the cloud,
making organisational or governance
changes or implementing better
processes. This helps to demonstrate IT’s
commitment to continuous improvement
and to delivering efficiencies to the
organisation as a whole.
STRATEGIC OPPORTUNITY
Adam Davison FBCS, Managing Consultant, Field Day Associates, discusses why an assessment of the
opportunities and threats is an essential final step toward the creation of a sustainable IT strategy.
Most IT strategies start at the same point:
obtaining a clear understanding of the
business requirements and then assessing
the strengths and weaknesses of the
current IT situation against them.
This is logical and well understood.
However this, of course, constitutes
only half of the well-established ‘SWOT’
analysis. What is sometimes missed
is the carrying out of the second half
of this analysis; an assessment of the
opportunities and threats resulting from
the proposed strategy.
Every organisation is different and
therefore the detail of every IT strategy
will be unique. Nevertheless, it is possible
to highlight areas of opportunity and risk
that are commonly found and these can
act as a starting point for this analysis.
Considering these points will not only help
ensure that the strategy has considered
all the significant issues, but should also
significantly reduce the likelihood of the
strategy failing to deliver.
Opportunities
Does the strategy bring new ideas and
opportunities to the organisation as a
whole?
The process of developing an IT strategy
tends, by its very nature, to be rather
reactive, with the focus on assessing what
the organisation wants and then defining
(albeit possibly in innovative ways) how
it will be delivered. However, as part of
the process, it is also worth taking a look
at potential opportunities that IT might
be able to bring forward to the business
rather than as a result of what the
business says it wants.
Typical examples of this might include
opportunities around the internet of things
or developments in big data, where the
thinking in the IT world is frequently ahead
of the mainstream. Illustrating such examples
may not be core to the IT strategy but it
does serve to demonstrate IT’s engagement
with the organisation as a whole and the
potential IT has to deliver added value.
Does the strategy fully embrace the
opportunities to improve the way IT is
delivered?
It’s easy for an IT strategy to be over-focussed
on the new technologies and the projects
that IT will be delivering for the business.
It is also important that IT takes a look
at itself and assesses whether there are
opportunities to improve how it operates
and where service quality can be improved
or costs reduced.
This may include considering areas
such as moving services into the cloud,
making organisational or governance
changes or implementing better
processes. This helps to demonstrate IT’s
commitment to continuous improvement
and to delivering efficiencies to the
organisation as a whole.
Does the strategy include an assessment
of opportunities for the CIO to improve
their or their department’s position?
‘What’s in it for me?’ is a question people
can be awkward about asking, but it is
nevertheless a valid one.
There is certainly nothing wrong with
a CIO considering whether the strategy
could open up opportunities for them to do
their job more effectively and to increase
the contribution that they can make to the
organisation. The sorts of topics that might
fall into this category could include:
• Should the scope of the CIO’s
responsibilities be broadened to
formally include, for example, business
change (relevant if the organisation is
embarking on an aggressive change
programme and this area is not
covered elsewhere)?
• Should the CIO have more input
into business strategy development
(where the IT strategy highlights a
strong interdependency between the
business plans and IT)?
Given the nature of these questions it may
be appropriate for any detailed discussion
on them to be carried out outside of the
overall IT strategy process. Nevertheless,
the development of the new or updated IT
strategy presents an ideal opportunity for
them to be raised.
Risks
Does the strategy align to the culture of
the organisation or consider how cultural
barriers to its delivery can be overcome?
Cultural change is often a weak area for
organisations. The IT organisation can fall
into the trap of saying: ‘we just deliver the
technology; it’s up to the business to work
out how to use it’, but this leaves huge risk
that a strategy will fail, often with IT (fairly
or not) being blamed.
It is therefore vital that likely cultural
obstacles are highlighted as early as
possible and it is made clear whether
addressing them is or isn’t seen as part of
IT’s job.
For example, a mobile working initiative
is at least as likely to be unsuccessful
because of resistance amongst staff to
working that way, or lingering ‘traditional’
management practices, as because of
technology problems. Similarly, an
organisation with a ‘seat of the pants’,
gut-instinct decision-making culture will
struggle to derive benefit from a major
investment in data analysis, however good
the technical implementation.
Has the strategic vision been limited by
constrained thinking?
Confirmation bias is a fascinating aspect
of psychology: the inherent human
tendency to seek out, or give preference
to, evidence that supports existing beliefs.
The whole scientific method has been
developed to overcome this behaviour.
Unfortunately, IT strategies, although they
should always be logical, are rarely totally
scientific.
It is, therefore, vital to stand back once
the strategy has been drafted and attempt
to take or obtain an unbiased view on what
has been proposed, for example, through
an independent review. If, for example, the
option of moving to the cloud has been
rejected, it is worth asking the question:
is this really because it is not appropriate
to the needs of the business, or has the
thinking been clouded by discomfort with
the impact this could have on the role of
the CIO and the IT department?
It’s not easy, indeed it’s probably not
possible, to be completely open-minded
and dispassionate about these issues but,
if an IT strategy is to deliver real value, it is
vital to try to play devil’s advocate and to
be open to challenges in this area.
Is the plan for the implementation of the
strategy realistic?
IT strategy development is a creative
process and, in defining all great new
things that are planned, it is very easy to
get carried away.
It is, therefore, always important,
before the work is completed, to really
consider whether what is planned has both
adequately covered all the issues and is
achievable.
Strategies often propose more change
than an organisation can absorb in the
time planned, which inevitably leads
to failure. Another vital area is security
where the scale of the effort needed to
address the various threats, both ongoing
and related to specific initiatives, is often
underestimated.
The specific issues raised in this
article are of course not exhaustive
and, depending on the nature of the
organisation, many others may exist.
However, whatever the risks and
opportunities identified, an assessment
of the opportunities and threats is an
essential final step in the creation of an IT
strategy.
CONNECTING
THE DOTS
Why is it that the information technology industry is not attracting
sufficient numbers of new entrants despite some great initiatives,
such as the Cyber Security Challenge1, doing amazing things to
bring awareness of the issues and opportunities to those outside the
industry? Estimates vary on the scale of the problem, but most
commentators seem to agree that the gap between supply and
demand is growing. Oscar O’Connor FBCS reports.
One of the reasons I believe we have a
skills shortage across many key disciplines
is that our teenagers only ever see
negative reports about what we do.
Hearing only of failure makes our
industry unattractive, and apparently it is
particularly unattractive to young women,
with only 15 per cent of the ICT workforce
being female2.
You might point to eBorders, the
BBC’s Digital Media Initiative, or the steady
stream of suppliers quitting the NHS’s
NPfIT as headline-grabbing examples of
how not to deliver. If we want to tempt
the next generation away from banking,
accountancy, law or any other discipline we
need them to see our industry as a place
where it is possible to do great things and
to be successful.
So why do we fail?
My conclusion, from more than 20 years’
experience across numerous sectors and
disciplines, is that there is an unhealthy
combination of delusion and collusion at play.
This serves nobody well; it leads to poor
decision-making and the kinds of project
failures, cost and time overruns that
happen too often to the detriment of our
collective reputation.
Looking to the future of our industry –
on which we, the professionals, rely for
our income, and on which the rest of
society relies for almost every aspect of
modern life – we need to attract more
young, talented people.
Fixed price?
Those of us long of tooth and grey of hair will
all recall the project manager’s dilemma of
balancing cost, time and quality. On the
surface, a fixed-price and fixed-time
approach can be very attractive as a means
of identifying a supplier with whom a
customer can work. It levels the playing field
so all suppliers are bidding against a set of
relatively well-defined criteria.
However, as an approach to managing a
project, it is a car crash waiting to happen
when circumstances prove to not be quite
as defined in the original invitation to
tender – and they never are.
Who mediates when acceptance of
deliverables is dependent on a meeting
of minds as to what ‘fit for purpose’ or
‘acceptable’ means? If a customer wants
an additional version of a document
deliverable for any reason, does that
constitute a contract change? Who was
responsible for there being a requirement
for an additional version?
On a recent project3, this occurred
regularly and on each occasion the
customer’s stated view was that the
additional iteration was required to bring the
document up to the required level of quality.
But if the supposed quality issues are
identified by a reviewer added to the list
of contributors for the final review, is it
reasonable for the supplier to incorporate
their comments at no additional charge to
the customer, assuming that the comments
warrant inclusion? Relationships can and
do break down over accumulations of such
apparently minor issues.
By all means use the fixed price concept
for tendering purposes, but let us please
stop fooling ourselves that any project
exists in the kind of vacuum required for
such a project to be delivered successfully
for all parties.
The ‘fixed price delusion’ – that it
is possible to obtain a high-quality
outcome from a fixed price contract,
on time and on original budget.
Buyers of services, systems, software
and so on rarely define with any great
accuracy what they want to buy, yet
expect suppliers to commit to fixed
timescales and fixed (low) costs for
delivering the ill-defined solution.
Suppliers collude with this delusion
by agreeing to be held to the costs
and timescales they propose… until
the first new or changed requirement
is specified, which triggers a
renegotiation of the cost and
timescales… rarely if ever in the
customer’s favour.
Need to know?
There are, of course, many sound business
and/or security reasons for not sharing
everything about your business with your
suppliers.
But consider that when you fail to
inform a supplier of something that
materially impacts their ability to deliver
something you have asked for, you give
them a justified opportunity to renegotiate
the terms under which that deliverable is
to be produced. I have worked on projects
where the customer routinely failed to
inform my project of the other activities
that depended on our outputs, except of
course when a delay to our project caused
a delay to another activity.
A project delivers an output to its
customer’s organisation, but for that
output to deliver the expected outcome
it needs more than a delivery note and
an invoice. We seem to have developed a
culture where the ‘need to know’ principle
is being applied with rather too much
enthusiasm and inappropriately.
Projects organised into portfolios and
programmes stand a far higher chance
of succeeding, at least partially, due
to the requirement for cross-project
communication and collaboration that are
inherent in the management of portfolios
and programmes.
Who delivers?
In contracting with suppliers, all too often
the customer’s team are less experienced
than the supplier’s pre-sales team and can
be led to conclusions that suit the supplier
more than they might the customer.
Then when the contract is awarded,
the supplier often hands the responsibility
to a project delivery team who have not
worked on the pre-sales stage and have
less experience and knowledge than their
pre-sales colleagues.
The ‘snake oil delusion’ – that one
supplier’s solution will address every
aspect of the customer’s requirement.
Suppliers regularly make claims for
their solutions in order to exclude
other suppliers from taking part in
a project, when it is rarely the case
that a single piece of technology will
provide a full end-to-end solution.
Customers collude with this delusion
by accepting such claims and making
ill-informed decisions that their
organisations will have to live with for
potentially decades to come.
Communication within organisations
is always a challenge and the separation
of pre-sales and delivery responsibilities
is one of the key areas that needs to be
addressed in supplier organisations if
the reputation of this industry is to be
improved.
My own experience, running a multi-million professional services division for
one of the major computer manufacturers,
is that breaking this barrier was more
effective than providing training or
incentives to improve communication.
When the people involved in making
the sale know they are also going to be
accountable for the delivery, there is a
subtle change in mindset that makes them
think farther ahead than just getting the
purchase order.
This results in better project outcomes
and happier customers, which builds trust
and directly leads to further business. More
business creates more opportunities for
new people to come in and experience the
success, which creates brand and industry
ambassadors.
As long as the negative behaviour
continues, projects will continue to cost
more than originally announced, results
will continue to be worse than required
and we, the technology supply chain, will
continue to struggle to attract new talent
into an industry with a very public record
of failure.
To address the skills shortage we need
to make our industry attractive again and
it is going to require more than a cosmetic
make-over.
Alternatively, we can work together
as an industry with our customers and
competitors to change the culture in
which technology projects are bought,
sold and delivered, with a greater
emphasis on effective governance, honest
communication, and seeking success for all
concerned, with excellence in evidence in
all activities.
The ‘silo delusion’ – that a project can
be successful without knowledge of
the wider context in which it is taking
place, such as the overall business
conditions and other projects that
can introduce new technologies or
restructure the organisation.
Customers rarely provide suppliers
with all of the information they need
about other activities taking place
across the organisation that depend
on, or will have some other impact on,
the suppliers’ project.
Suppliers accept this wilful ignorance
in the hope that there might be
money to be made through change
control when the ‘unknowns’ become
‘knowns’.
It can be done. I have the evidence4 to
show that making these changes can turn
failure into success, convert losses into
profits, make customers and stakeholders
happy and build teams that are proud to
work together.
References
1. www.cybersecuritychallenge.org.uk
2. Source: Mortimer Spinks/Computer
Weekly Technology Industry Survey 2014
3. …with a European telecommunications
organisation
4. 2004-2006 UK professional services
division of US-based computer hardware
manufacturer: project success rates from
<50% to >95%, unplanned expenditure
from >$2m to <$75k, turnover from $12m
to $43m, team from 25 to 90.
WOMEN DIGITAL LEADERS
Jacqui Hogan MBCS deliberates on the current lack of female leadership within the information
technology industry.
When I first thought about writing this
article, I thought it would be easy to find a
wide range of female digital leaders from
whom to learn and to admire. However, it
has proved very difficult to find very many.
This got me thinking. Why are there so few
women at this level?
You would think that in these days of
technology infiltrating every aspect of our
lives, we would have women distributed
at all levels of IT. However, a recent
Computer Weekly report revealed that the
average percentage of women working in
technology teams in the UK is 12 per cent
– down from 15 per cent last year. Recent
research from Approved Index shows
female leadership in new technology
start-ups at less than 10 per cent, and at
the very top, representation of women as
digital leaders is equally dismal. Sadly, this
is also reflected in engineering and, to a
lesser extent, science.
Is it that schools do not encourage the
pursuit of the more technical aspects
of IT for girls? A lack of knowledge and
experience within schools does not help
to create interest, never mind excitement.
A curriculum that focusses too much on
the superficial use of technology rather
than the creation of technology has also
failed to develop the skills needed.
Chloe Basu, currently studying history
and international relations at St Andrews
University, says:
‘When I was growing up, it was often
viewed that computer games and IT
were pursuits for boys, not for girls – for
example there were large numbers of
“wham, bam, zoom and kill” competitive
and destructive games for boys. Teachers
were surprised if girls displayed an
interest in computers deeper than our
basic skills as daily users of PCs, laptops,
tablets, mobile phones and our other day-to-day technology.
‘On the one hand, such reactions made
a few girls determined to start an interest
in the subject, but having seen this lack
of encouragement for most of my school
life, I and most of my female friends did
not feel confident or interested enough
to pursue “geeky” technical aspects of IT,
instead selecting other topics from the
extensive range of interesting educational
possibilities open to us.
‘My girlfriends and I were also
discouraged by media reports about the
male-dominated IT and technology worlds
in which women were liable to face
prejudice, lack of opportunity and “glass
ceilings”. All of these factors combined to
put me somewhat off IT as an academic
and career choice.’
Is it the geeky image portrayed in the
media and popular culture? Although we
do see many more female
technology characters in various TV series,
e.g. ‘Criminal Minds’, they tend to be just
as ‘strange’.
The recently released (and quickly
withdrawn) ‘Computer Engineer Barbie’
book shows how confused we are about
the role of women in the digital world.
While you might think it was a positive
move to have a popular girl’s toy depicting
a female computer engineer (thus killing
two birds with one stone), in practice it
only served to reinforce that women still
need men to do the hardcore coding of
software development.
It was perhaps gratifying that there was
such an outcry that the book was quickly
withdrawn.
Perhaps it is the so-called ‘macho’
environment in many IT departments and
companies holding women back from
progress. A working environment that
belittles women and treats them as
figures of fun is not going to attract many
women.
Nevertheless, other largely male working
environments, (e.g. construction) have not
seen the same failure to attract women.
What is it about IT that is different?
Some say that our concern should be
less about the lack of women in IT and
more about the lack of interest in an IT
career across the board. My own view is
that they are part of the same problem.
Lack of diversity creates a feeling of
exclusivity in a culture that makes it less
attractive to anyone outside of the core
group. The core group thus becomes more
exclusive and more entrenched in its
attitude.
The more girls see and experience a range
of different and positive female technology
role models, from female digital skills
teachers, through senior software developers
to technology CEOs, the greater the attraction
for them to take the first step on the path to
becoming great digital leaders.
Keeping all this in mind, who are those
rare female role models, and how have they
beaten the odds to become digital leaders?

Marissa Mayer, CEO of Yahoo!, was the
first female engineer at Google and their
20th employee, rising through the ranks to
executive roles including Vice President of
Search Products and User Experience.
During her time at Google, she taught
introductory computer programming at
Stanford, winning several teaching awards.
Today, she is the CEO of Yahoo! at the age
of just 39. In September 2013, it was
reported that the stock price of Yahoo! had
doubled over the 14 months since Mayer’s
appointment.

‘I’m not a woman at Google; I’m a geek
at Google. If you can find something that
you’re really passionate about, whether
you’re a man or a woman comes a lot less
into play. Passion is a gender-neutralizing
force.’ (CNN, April 2012) Marissa Mayer

Sheryl Sandberg, COO of Facebook, also
worked for Google, joining them in 2001 as
Vice President of Global Online Sales and
Operations. She went on to join Facebook
in 2008 as their COO. It was her job to work
out how to make this brash newcomer
profitable. In 2010 she succeeded, and in
2012 she became the eighth member (and
the first female member) of Facebook’s
board of directors.

‘Endless data show that diverse teams
make better decisions. We are building
products that people with very diverse
backgrounds use, and I think we all want
our company makeup to reflect the make-up
of the people who use our products.
That’s not true of any industry really, and
we have a long way to go.’ (USA Today,
August 2014) Sheryl Sandberg

Martha Lane Fox, now Baroness Lane-Fox
of Soho, is probably most famously known
for starting Lastminute.com during the dot
com boom (and crash) of the late 1990s.
More a businessperson than a traditional
‘techy’, she followed her passion for the
use of technology to become the UK’s
Digital Champion. She stepped down from
this responsibility recently to concentrate
on the charity Go ON UK, which focussed on
making the UK the world’s most digitally
skilled nation. In her maiden House of
Lords speech, she highlighted the lack of
digital skills at the top of almost all
corporate, public and political
organisations.

‘Technology can break down the
barriers. I wish we could break down more
social divides using technology.’ (Growing
Business, November 2010) Martha Lane
Fox

With other rising stars like Emma
Mulqueeney, who has organised the Festival
of Code for six years, and Justine Roberts,
founder of Mumsnet, maybe the situation
is improving. This, together with the
increased emphasis on digital literacy in
the school curriculum, perhaps means we
can look forward to the rise of more world-class
female digital leaders.
Watch this space.
CHANGING WORLD
Bruce Laurie and Stephen A. Roberts, University of West London, School of Computing and Technology,
discuss the challenges and opportunities inherent in having an IT strategy in an ever changing world.
That ICT strategy needs to be aligned
to business and organisational strategy
seems obvious but it is not always common
practice.
For instance the NHS tried to centralise
IT patient administration systems at a time
when it was a decentralised organisation.
Both strategies were individually valid:
in the NHS individual organisations
experienced some noteworthy IT failures
and decentralisation to self-governing
foundation trusts was a recognition
that over one million staff could not be
managed from Whitehall; however the
pursuit of both strategies simultaneously
was a key factor in the eventual
abandonment of the National Programme
for ICT.
What constitutes a business strategy is
now also more complex. In any competitive
market the concept of a strategy that
remains fixed for a period of several
years while the supporting IT strategy
is rolled out seems a distant dream.
Increasingly businesses are moving
from a deterministic ‘fixed’ strategy to a
more ‘probabilistic’ approach: the general
direction is clear, but within an envelope
of possibility. Increasingly businesses are
seeking a number of ‘strategic thrusts’
in the general direction of travel and
optimising the situation as they go along.
At the same time the prolonged
economic downturn has challenged
assumptions about businesses always
growing. Assumptions about the benefits
of integration also have to be challenged
in a world of buying and selling companies
and of demergers as much as mergers.
Companies are also trying to collaborate
along the supply chain and in alliances
in a rich sense, including joint design
teams, and they can be competing and
cooperating at the same time, in so-called
‘co-opetition’.
The systems scene
In this type of economy IT is faced with
four big issues:
1. If an organisation has spent several
years putting in enterprise resource
planning (ERP) or customer
relationship management (CRM)
systems, these are not going to be
replaced in a hurry. They are the
foundations of the business processes,
necessary to operate. If a competitor
has similar software, the differentiation
will come not by transactional efficiency
- just a given - but by better use of the
information that can be extracted from
these databases.
2. Transactional systems increasingly
need to be integrated along the supply
chain and this requires a more
collaborative approach to systems
and architecture.
3. Internally controlled data and
information is no longer sufficient.
Managers need good internal
information but increasingly also
external information, particularly at
the highest levels in an organisation.
An area manager may primarily be
interested in the performance of the
units under their direct control but
the board will be as concerned about
their competitors’ performance, socio-economic trends and market
developments. This implies a
convergence of traditional IT data
management skills with those of
wider information science. We must
become used to dealing with data
from a wide range of sources.
4. During the 20th Century the role of IT
was to support part-automation of
clerical and similar repetitive processes.
The business driver was efficiency: fewer
clerks. The methodologies used owed
much to work study practitioners of
the earlier part of the century, and the
underlying assumption was that
processes could be automated if only we
studied them carefully enough. Iterative,
prototyping methodologies have enabled
a dialogue with the users in the design
process and more rapid deployment,
but fundamentally focused on relatively
straightforward applications. While there
are always areas for greater efficiency,
for many businesses that phase of
‘computing for clerks’ is now on a long
replacement cycle as underlying
technology improves.
The changing role of the CIO heralds a
different purpose for IT in the 21st century:
developing ‘computing for professionals’.
IT’s aim is to assist the professional in
doing their job more effectively. For
instance, the use of 3D design programmes
has enabled engineers throughout Europe to
collaborate in building component parts
of an Airbus aeroplane so that it all fits
together, can be wired and assembled and
still compete with Boeing.
The resultant ‘application’ is part IT, part
human and the interaction more complex.
There needs to be mutual respect and risktaking by professionals and IT alike, and
real ownership by the professionals of the
overarching hybrid process. ‘Work-study-like’ approaches to these problems have
a limited role. It is interesting to note that
while the national patient record system
floundered, the programme to digitise
imaging, X-ray, CAT scan and MRI has
been a conspicuous success: the central
team focussed on the overall deals with
suppliers and the high-level architecture;
local hospital IT teams and consultant
radiologists took local ownership and
developed new ways of working to
optimise the use of digital technologies.
Thus a consultant radiologist can examine
an image at home at the weekend or
overnight in the case of an emergency and
determine the precise course of action.
Such digital imaging has taken much of the
guesswork out of medicine.
The technologies
The ubiquity of the internet and the growth
of mobile networks have truly broken
down the confines of the office or factory.
For much of the world, especially in China
and Africa, people are not slaves to the
copper wire put in by the telephone
companies (they have gone straight to
radio), nor is the PC as important for them
as it is for us in the West. The world-leading mobile phone
banking use is in Kenya, because there
are no bank branches in the villages and
mobile phone companies have become
trusted agents. The true implications of
‘always on anywhere’ are only beginning to
be understood, not least by regimes who
can no longer control information (such as
during the Arab Spring).
Our ability to store images as well as
traditional data has outpaced our ability
to use it effectively and we are only just
beginning to develop applications that
truly exploit our imaging abilities. For
instance most fashion websites give little
impression of how clothes will look on the
potential wearer, the buyer.
‘Big data’ is beginning to move from the
early adopter and hype stage to a mature
part of the infrastructure, but it is not yet a
mature technology. However organisations
will need far more analytical skills to
analyse and interpret big data.
Similarly speech recognition, which has
been around for many years, is maturing
by concentrating on specific lines of
business and thus reducing the vocabulary.
People talking to computers will become
much more normal.
The move to cloud hosting will almost
certainly continue at pace. Whether we
eventually think of computing power as a
utility like plugging in the electricity is not
clear, but we are building a flatter, more
homogenised global computing capacity.
This brings security fears and
unresolved tensions: we want our private
communications to be private, but we
want the government to protect us from
terrorism. Security will increasingly be
considered a fundamental aspect of
architecture rather than a supporting
feature.
Therefore aligning business and IT
strategy has never been more important
and the need for an enabling architecture,
often across businesses and along
the supply chain, is vital. We as IT
professionals have to join our business
colleagues in managing the increasing
uncertainty.
PRODUCTIVITY DRAIN
Jeff Erwin, President and CEO, Intego and Flextivity, explains why implementing a policy of radical
transparency in your business can help you to reduce the productivity drain.
The rise of social media has created a
significant management dilemma for small
and medium-sized businesses.
While business leaders understand that
sites like Facebook and Twitter can be a
major distraction, they are also reluctant to
impose draconian and overly restrictive IT
policies on their employees that can create
a culture of paranoia and mistrust.
One suggested strategy is from a recent
Harvard Business Review article1. It is
called radical transparency and involves
businesses rethinking how they approach
productivity, technology, and staff
education and measurement.
Radical transparency is essentially the
belief that if a business makes the majority
of its corporate information transparent
internally, it will foster greater trust,
improve morale and boost productivity.
This in turn will bolster its competitive
advantage and increase business
performance, without leaving the business
at risk to the dangers of the internet and
potential legal issues.
Traditionally, technology lacked the
capability to provide full business visibility
across all operational areas, but this has
changed with the introduction of cloud-based
activity monitoring – putting all
information in one place – which has made
it easier for businesses to implement
radical transparency. Businesses can
now create a more productive working
environment easily and cost-effectively,
but many still haven’t taken any steps in
that direction.
In fact, according to a Gartner report2,
only 10 per cent of businesses have
implemented activity monitoring safeguards
to monitor their employees’ social media
use for security breaches, which poses the
question – how does a business know what
its staff are exposed to?
Imagine the opposite - knowing exactly
how your staff are performing at any
time of the day. As a business leader,
you can start the day by assessing staff
performance with weekly productivity
trend analysis. You can then proactively
respond to those of your staff who are
struggling to maintain a productive role
and reward those showing progress or
signs of improvement.
At the same time, you can highlight
legal issues and educate staff better in
secure internet use.
Radical transparency curbs ‘cyber-loafing’
With the right technology solutions,
the above vision can become a reality.
However, radical transparency has to be
built around trust. Position the change as
beneficial to staff, not with the intention of
becoming the workplace equivalent of Big
Brother watching their every move.
By making everyone’s activities visible,
even yours as the employer, everyone
gains insight and can hold an informed
opinion. If a manager’s productivity and activity are visible and comparable, this will build further trust and strengthen their leadership position.
Furthermore, when distractions are removed through flexible policies, personal and team accountability rises, and staff have an incentive to focus on time-sensitive activities.
However, it’s important that your
workforce embraces the idea that
radical transparency will ultimately be
to their benefit. It is not about spying
on personal web surfing activities, but
rather represents a proactive approach
to understanding how they are spending
their time at work, so they can be
more productive and ultimately, more
successful.
The noted management consultant
H. James Harrington once said that
measurement is the first step. If you
do not measure something, you do
not understand it. And if you lack
understanding, you cannot control the
outcomes. Without control you will never
improve.
Think of this in the context of your
business. First you measure what is
causing a drop in productivity, and then
you seek to understand why it is happening
(boredom, lack of work, poor leadership
and so on). Once that information is
grasped, implement control methods
to - in the long run - create a sustainable
management and IT methodology that
delivers business improvement.
Knowledge meets power
Harvard Business School conducted an
experiment and found that people who
successfully resist the temptation to use
the internet for personal reasons at work
actually make more mistakes.
It is harder for people to learn new skills
as well, mainly because they have reduced
access to the knowledge traditionally
available online.
Create a culture of employee self-assessment and accountability. Locking
staff away does not prevent distractions
from occurring, especially with the advent
of smartphones, but if you explain why a
strategy of radical transparency has been
implemented, staff should begin to control
their own behaviour and feel more inclined
to work productively.
If a dashboard of how everyone on
the team is performing is available,
workers will feel the urge (due to natural
competition in business) to increase their
work output. It also allows management
to reward those employees using data, rather than subjective factors such as personal preference or unfounded loyalty.
Radical transparency also keeps the
business safer and reduces the risks of
legal issues. If management can see what
people are accessing and communicating,
they can highlight individuals who are
leaving themselves open to phishing
or threats like malware. One-on-one education can then take place to explain the need for change, keeping your business data protected and your staff more vigilant.
The benefits are many, but ultimately
a transparent, rewards-based culture
(management should be measured as
much as their employees) will ensure a
more efficient business unit that works
together, driving business success in the
process.
References
[1] https://hbr.org/2012/10/why-radical-transparency-is-good-business/
[2] www.pcworld.com/article/256420/gartner_predicts_huge_rise_in_monitoring_of_employees_social_media_use.html
COMPUTER ARTS
COMPUTER ART
TODAY AND TOMORROW
Catherine Mason, from the Computer Arts Society (CAS) and author of A Computer in the Art Room: the
origins of British computer arts 1950-80, looks back on a year of digital art, commemorates the work of
Alan Sutcliffe and looks ahead to a new year of computer art.
The diverse and eclectic world of
computer and digital arts had an exciting
year and the BCS special interest group,
the Computer Arts Society (CAS), is pleased
to see continued developments, especially
within museums and galleries that contribute
to raising the profile of the genre and in
particular that of British artists throughout
this country and internationally.
We were treated to several museum-quality exhibitions this past year, including the major show Digital Revolution at the Barbican Art Gallery (July to September): www.barbican.org.uk/digital-revolution.
Focusing on ‘things that are creatively
ambitious for their time’, from the 1970s
to present day, and with many special
commissions, this successful show confirmed the Barbican’s growing specialism in digital art. It is expected to tour at home
and internationally at least for the next five
years.
CAS member Ernest Edmonds
organised Automatic Art: human and
machine processes that make art in July
at the GV Art gallery, London:
http://interactlabs.co.uk/files/5980/gv-art-de-montfort-university-automatic-art-press-release.pdf
A survey of 50 years of British artists
who create work by using personal
rules or by writing computer programs
to generate abstract visual forms,
it demonstrated that the approaches of the 1970s systems artists and of computer artists overlap and have many connections.
Gustav Metzger’s show at Kettle’s Yard Gallery, University of Cambridge, in August
focused on his lesser-known auto-creative
art. Gustav became famous for inventing
auto-destructive art in the early 1960s
and was one of the first artists in Britain
to write of the potential for computer
use in art. CAS lent exhibition materials,
including copies of PAGE, our periodical
first published in 1969 with Gustav as the
founding editor.
The name PAGE was chosen by Gustav
as initially there was only one page
available for printing (due to costs) and it
was a pun on the concept of paging (the
use of disk memory as a virtual store,
which had been introduced on the Ferranti
Atlas Computer).
The questions Metzger posed about
technological responsibility in the 1960s
seem as pertinent, perhaps even more so,
today. Although we still may not have any
satisfactory answers, art can remind us to
keep asking and challenging the status quo.
An important collection of early plotter drawings by German artist Frieder Nake was acquired by the Victoria & Albert
Museum with funding assistance from
the Art Fund. The acquisition of a full set
of such early algorithmically-generated
works will enable study of the progression
of the artist’s process.
To use digital computing for artistic
aims, at a time when digital computing
itself was in its infancy, required a great
leap of faith by pioneers. The acquisition adds to the impressive computer art collections of the V&A and its continuing profile as the National Centre for Computer Arts.
In a first for a UK museum, the V&A
appointed a Games Designer in Residence
late in 2014. Sophia George, inspired by
the museum’s collections, especially that
of William Morris, worked with the public
and schools to develop a game based on
Morris’s well-known Strawberry Thief pattern. It was released in October for iPad and iPhone. BAFTA-award-winning Sophia aims to change perceptions of game design, not least by showing that young women can be involved in games too.
www.vam.ac.uk/content/articles/g/game-designer-resident-sophia-george

In October an exhibition and associated conference, Cybernetic Serendipity: A Documentation, was held at the ICA, London, to explore the impact and continued relevance, nearly 50 years on, of the pioneering Cybernetic Serendipity exhibition held there in 1968. This ground-breaking show was the first international exhibition in the UK devoted to the relationship between the arts and new technology, and featured collaborations between artists and scientists.

In addition to an assortment of monthly talks and presentations, the CAS programme this year included the annual EVA Conference held at the London HQ of BCS (www.eva-london.org). This is always a good opportunity to apprise oneself of the latest international research in visualisation, design, animation and digital archive documentation. Michael Takeo Macgruder, who has been developing interactive digital art since the late 1990s, exhibited his new dome-based work Data Sea as part of an art exhibit curated by our chairman, Dr Nicholas Lambert.

In line with CAS’s mission to support and propagate computer art, we support the Lumen Art Prize, a major international competition with 800 submissions from 45 countries: http://lumenprize.com.

An exhibition of all the prize-winners was launched at Llandaff Cathedral and will be shown in Athens, New York, Hong Kong and London, with a joint show at Ravensbourne and Birkbeck in March 2015. Digital artists in all areas are encouraged to submit to this prize in 2015, as it will also enable them to connect with a broader public. The vast array of different styles and approaches that this prize attracts demonstrates the vibrancy of contemporary technological art.
Also in 2015 we will be looking forward
to an exhibition in Rio de Janeiro featuring
the work of CAS members Harold Cohen,
Frieder Nake, Ernest Edmonds and Paul
Brown. ‘Códigos Primordiais’ (Primary
Codes) will occupy all three floors of Oi!
Futuro exhibition space in Rio:
www.oifuturo.org.br/en
In February we mourned the loss of
one of our founder members and first
chairman, Alan Sutcliffe. It was his idea
to gather together a group of like-minded
people who were interested in the
creative use of computing, for exchange
of information, discussion and exhibition
opportunities. Together with George
Mallen and R John Lansdown he founded
the CAS in 1969, the first practitioner-led
organisation of its kind in Britain. Alan was
also a stalwart of BCS; in the late 1970s
he was elected Vice President for the
specialist groups.
His deep knowledge of and enthusiasm
for the subject together with the
welcoming, inclusive nature of his
personality will be greatly missed. Look out
for a special issue of PAGE dedicated to
him later in 2015.
If you are not yet signed up to our (free) emailing list, then please visit
www.jiscmail.ac.uk/cgi-bin/webadmin?A0=CAS
See also our website, http://computer-arts-society.com, and Facebook:
www.facebook.com/pages/Computer-Arts-Society/303023289760986
All are welcome.
About the author
Catherine Mason is a committee member of CAS, the author of A Computer in the Art Room: the origins of British computer arts 1950-80, and writes a regular column on contemporary computer art for the BCS website and ITNOW magazine.
www.catherinemason.co.uk
THE LAST PAGE
Reproduced from xkcd.com
ENHANCE YOUR IT STRATEGY
DIGITAL LEADERS 2015
We hope you enjoyed this edition of Digital
Leaders and found the articles interesting
and stimulating.
As a reward for getting this far, we thought you might appreciate some light relief courtesy of our good friends at the excellent xkcd.com.
bcs.org/digitalleaders
PLAN FOR THE
FUTURE
Change drives business success. IT innovation drives change.
You are the one in the driving seat, but you need the
capability within your IT team to deliver that change.
Editorial team
Justin Richards Editor
Henry Tucker Editor-in-Chief
Jutta Mackwell Assistant Editor
Grant Powell Assistant Editor
Brian Runciman Publisher
Production
Florence Leroy Production Manager
The opinions expressed herein are
not necessarily those of BCS or the
organisations employing the authors.
© 2015 The British Computer Society.
Registered Charity No 292786.
Copying: Permission to copy for
educational purposes only without fee
all or part of this material is granted
provided that the copies are not made
or distributed for direct commercial
advantage;
BCS copyright notice and the title of the
publication and its date appear; and notice
is given that copying is by permission of
BCS.
BCS The Chartered Institute for IT
First Floor, Block D, North Star House,
North Star Avenue, Swindon, SN2 1FA, UK.
T +44 (0)1793 417 424
F +44 (0)1793 417 444
www.bcs.org/contact
BCS, The Chartered Institute for IT, is the only
professional body dedicated to IT. We are here to turn
your team into competent and confident innovators.
Our globally recognised IT skills framework SFIAplus
will align your IT capability with your business objectives
to deliver successful business transformation.
Start driving change today at
bcs.org/leadersforchange
Business analysis • Agile • Project management • SFIAplus • Consultancy
Incorporated by Royal Charter 1984.
Jos Creese BCS President
Paul Fletcher Chief Executive
© BCS, The Chartered Institute for IT, is the business name of The British Computer Society
(Registered charity no. 292786) 2015
BC1097/LD/AD/0315