Work with your Security and Governance teams to thwart cyber attacks

A Petya ransomware attack, suspected to use a modified version of the EternalBlue exploit, is spreading around the world as we go to press, with UK and European organisations already affected; shipping company Maersk and ad agency WPP have announced problems with systems down.

Only a few days after the attack on the UK Parliament on Friday 23rd June, security experts are describing such high-profile attacks as the ‘new normal’.  Weak passwords on email accounts were blamed for the compromise of around 90 parliamentarians’ accounts.  An official spokesperson commented that users had failed to adhere to official guidance from the Parliamentary Digital Service.  As a precaution, remote access was immediately disabled whilst further investigations were made.

This follows hot on the heels of last week’s report by Which?, revealing that communications giant Virgin’s consumer Super Hub 2.0 router was vulnerable to hacking for those who had not changed the default wifi password, which experts felt was too short and insufficiently complex.  According to penetration testing experts, Virgin is not alone amongst Internet Service Providers in issuing relatively simplistic wifi keys.  Future success in thwarting attacks will require: 1) a change of culture amongst consumers, who should proactively change the default password on any wireless device; and 2) retailers ensuring that instructions for changing the password are presented as soon as the service is accessed, easy to read and quick to follow.
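The complaint about short, simple default keys can be made concrete with a rough passphrase strength check. Below is a minimal sketch in Python; the length and entropy thresholds are illustrative assumptions, not an official standard:

```python
import math
import string

def passphrase_strength(password: str) -> str:
    """Rough strength estimate from length and character variety.
    Thresholds below are illustrative assumptions, not a standard."""
    # Estimate the attacker's search alphabet from the character classes used
    pool = 0
    if any(c.islower() for c in password):
        pool += 26
    if any(c.isupper() for c in password):
        pool += 26
    if any(c.isdigit() for c in password):
        pool += 10
    if any(c in string.punctuation for c in password):
        pool += len(string.punctuation)
    if pool == 0:
        return "weak"
    # Naive entropy in bits: length x log2(alphabet size)
    bits = len(password) * math.log2(pool)
    if bits < 40:
        return "weak"   # short, single-class default wifi keys typically land here
    if bits < 70:
        return "fair"
    return "strong"
```

By this measure, an eight-character lowercase key of the kind criticised above scores under 40 bits, while a longer mixed-character passphrase scores comfortably higher.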

And all of this comes just one month after the WannaCry cyber attack on NHS England, which was amongst around 70 organisations hit worldwide.  Brian Lord, former Deputy Director for Intelligence and Cyber Operations at GCHQ, commented in May that this reflected a shift from the low-level theft and ransomware of the past few years to internationally organised crime.  Today’s criminal networks can mount sustained, co-ordinated attacks on ageing IT systems, delivering a simple tool at mass scale to vulnerable areas – in this case, systems where Microsoft security patches had not been applied.

The clear messages from these tales of woe are:

•    Ensure effective security and governance procedures are in place for businesses and institutions – and that these are shared, understood and adhered to by all staff without exception, through regular training and awareness education.
•    Consider two-factor authentication and more intelligent solutions around identity management and password tools to keep the door closed to wrongful access.
•    Protect older, more vulnerable operating systems through regular security assessments and vulnerability detection programmes that scan your networks for holes in perimeter security, helping you target your patching priorities.
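As a sketch of what the two-factor authentication bullet involves under the hood, the widely used time-based one-time password scheme (TOTP, RFC 6238) can be implemented with the Python standard library alone. This illustrates the mechanism only; it is not a production authenticator:

```python
import base64
import hashlib
import hmac
import struct

def totp(secret_b32: str, timestamp: int, digits: int = 6, step: int = 30) -> str:
    """RFC 6238 time-based one-time password (HMAC-SHA1 variant)."""
    key = base64.b32decode(secret_b32, casefold=True)
    counter = timestamp // step                      # 30-second time window
    msg = struct.pack(">Q", counter)                 # counter as big-endian 64-bit int
    digest = hmac.new(key, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                       # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % (10 ** digits)).zfill(digits)
```

With the RFC 6238 test key (base32 `GEZDGNBVGY3TQOJQGEZDGNBVGY3TQOJQ`) and timestamp 59, this reproduces the published eight-digit test vector 94287082.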

Rome wasn’t built in a day, but organisations that do not have strong and effective preventative measures can easily fall in one.  Keep security at the forefront of your thinking and actions.  Read our full article on Ransomware here.

UK prepares to open its doors to the ‘National Cyber Security Centre’

The new body responsible for the UK’s cyber security which was unveiled by the Chancellor last October has been named and will officially open its doors in London in October 2016.

The National Cyber Security Centre (“NCSC”) will be the pooling point for guidance and communications on cyber security. Historically, this function has been handled by GCHQ; however, as a secret intelligence service, GCHQ has been off limits to business, resulting in a lack of clarity in this key area of national risk.

The new entity will have one foot in the closed intelligence world and the other in the public and corporate space. The NCSC will work with regulators such as the Bank of England to provide advice to the private sector and with government departments and national infrastructure groups.

In the event of a cyber attack, liability would still vest with the entity that owns the data, but the NCSC would be the first port of call should another major cyber breach like the TalkTalk incident take place. Additionally, it will be the fulcrum for setting standards for the financial sector, increasing resilience against cyber threats that could impact the UK economy.

In an interview with the BBC, Matthew Hancock, the Minister for the Cabinet Office said, “We need to have a one-stop shop that people inside and outside government can go to”, saying that the NCSC will aim to be the authoritative voice on information security in the UK.

Designed to bring the UK’s cyber expertise into one place, the Board appointments have been announced: Ciaran Martin, currently a senior official at GCHQ, will be the NCSC’s first head and joining him will be Dr Ian Levy, as Technical Director (also from GCHQ).

~~~

All of this follows recent news of the opening (also in October 2016) of a national cyber security academy at Newport in Wales, aimed at training people to fight internet crime. Computer forensics and computer security undergraduates will be trained to work with businesses to identify cyber challenges. If successful, the course will be developed into a full-time cyber security degree. With some companies spending £16m per year to protect themselves online, this couldn’t come soon enough for business as cyber crime becomes ever more intelligent.


The Week’s Technology News – 27th February 2015

Boards acknowledge cyber risk on their 2015 agenda

Back in 2013, following a KPMG report warning that cyber leaks at FTSE 350 firms were putting the UK’s economic growth and national security at risk, the heads of UK intelligence agencies MI5 and GCHQ asked leading businesses to take part in a Cyber Governance Health Check.  The results were a stark wake-up call.

As we reported in our blog on 19th December, Board engagement is pivotal to the success of any cyber security plan and to thwarting the eye-popping 80% of attacks in 2014 that were preventable.

The 2015 Cyber Governance Health Check has just been published and reveals that 88% of companies now include cyber risk on their Risk Register, with more than 58% anticipating increased risk over the next 12 months.  However, only 21% say their boards receive comprehensive information, and only 17% regard themselves as having a full understanding of the risks. This is clearly insufficient in the light of the continuing squeeze on data security and compliance measures.

You do not have to be a FTSE 350 company to want the continued trust of clients and the comfort of up-to-date data security measures.  So wake up and smell the budding roses of 2015 and do your own health check review now:

  • Re-evaluate what the unique crown jewels of your organisation are (key information and data assets), as they may have changed in the past 12 months.
  • Review risk from any 3rd party suppliers and avoid contractual complacency – get into active compliance.
  • Be pro-active about risk and create a competitive advantage over rivals.
  • Arrange for a ‘pen test’ and get in shape to be security fit for purpose in 2015.



Windows Server 2003 is dying – but Windows Server 2012 will offer an elixir

With Windows Server 2003 reaching end of life and Microsoft support ceasing on 14th July 2015, the effect will be severe for the many businesses still running this server in their data centres, leaving them exposed to cyber attack unless considered steps are taken now to plan an upgrade.

Microsoft’s own survey recently confirmed that there were 22 million instances of WS2003 still running.

Organisations clearly need to plan their migration strategy – and quickly – if they are going to protect their infrastructure. End of support means no patches, no safe haven and no compliance.  Any company continuing to run WS2003 beyond July will fail regulatory compliance audits, which could result in lost commercial contracts. So delay is not only expensive but highly risky.

The advances in the data centre with Windows Server 2012 R2 offer integrated virtualisation of compute, storage and networking, along with enterprise-class scalability and security.  The Cloud options of Microsoft Azure and Office 365 will deliver applications faster, increase productivity and flexibility – and take away risk.

Security implications

  • Software and hardware compatibility – if you are running a mixture of physical and virtualised servers, priority should go to addressing physical assets, as most WS2003 licences are tied to the physical hardware.
  • Compliance with many industry requirements has moved from a best practice ‘good to have’ to a mandatory requirement, so there is no option but to act.
  • Payment Card Industry Data Security Standard (PCI DSS) v2, v3 – you will be unable to provide the assurance levels needed to meet PCI requirements.
  • UK Government – connecting to the Public Services Network (PSN), whether through an assured connection or via an Inter Provider Encryption Domain (IPED), will be a headache if updates cannot be supported securely.
  • Industry standards – standards such as ISO 27001:2013 and the Cloud Security Alliance controls all require you to ensure your systems and applications are up to date.
  • Disaster Recovery and Resilience – how do you restart servers that are no longer supported? If DR is key to your business then migrating is a necessity, however expensive it may be.

Planning to move

  • Integrate your servers and their lifecycle into your strategy and risk management process.
  • Check what the servers do for you and carry out a data mapping, flow and services exercise.
  • Identify your core assets and assess them against confidentiality, integrity, availability and likelihood of compromise to help future design and investment decisions.
  • Create a fit-for-purpose security architecture within your Cloud (i.e. should you need to retain rarely used legacy data, create security zones using layered firewalls, ingress and egress controls, file integrity checks and protective monitoring).
  • Test – lots – and then get a 3rd party certified security professional to conduct an ethical hack.
  • Failure to plan is planning to fail – do not let your business suffer by putting your head in the sand.
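The asset identification step above can be sketched as a simple scoring exercise that ranks systems for migration priority. A minimal illustration in Python follows; the asset names, ratings and weighting scheme are invented for the example:

```python
# Rank assets by impact (highest of the CIA ratings, 1-5) multiplied by
# likelihood of compromise (1-5), to guide migration priority.
# All names and numbers below are illustrative assumptions.

def risk_score(asset: dict) -> int:
    impact = max(asset["confidentiality"], asset["integrity"], asset["availability"])
    return impact * asset["likelihood"]

assets = [
    {"name": "payroll-db",    "confidentiality": 5, "integrity": 5, "availability": 3, "likelihood": 4},
    {"name": "intranet-wiki", "confidentiality": 2, "integrity": 2, "availability": 2, "likelihood": 3},
    {"name": "public-web",    "confidentiality": 1, "integrity": 4, "availability": 5, "likelihood": 5},
]

# Highest score first: migrate (or isolate) these servers earliest
ranked = sorted(assets, key=risk_score, reverse=True)
for asset in ranked:
    print(f"{asset['name']}: {risk_score(asset)}")
```

Even a crude ranking like this makes the design and investment conversation with the Board concrete: the servers at the top of the list justify migration budget first.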


The Week’s Technology News – 19th December 2014

IT security needs embracing in the boardroom
Speaking from GCHQ headquarters this week, Minister for the Cabinet Office Francis Maude urged businesses to make IT security a boardroom issue.  Amicus ITS has recommended this point repeatedly in blogs this year.  Government is now urging businesses to review IT security as an integral part of the Board’s strategic thinking, to ensure secure data management remains at the heart of the agenda.

With recent breaches affecting major household names both in the UK and the US, Maude warns against complacency:  “All companies, large or small, face threats from vulnerabilities on a daily basis”.

The Government’s launch of CERT-UK earlier this year created a cyber security information sharing partnership, which now enables 750 organisations to exchange information in real time on emerging threats and vulnerabilities.   Maude pointed to GCHQ data showing that 80% of attacks were preventable if best practice was followed.

As organisations reflect on 2014 with their staff at Christmas parties up and down the land, a cautionary ice cube should be travelling down the spine of any Board member whose business has not thought to place IT security at the forefront of its business continuity plans.  For them, January will be the time to pull this into focus on the 2015 agenda: to review, consult, embrace and invest as required, to ensure the bottom line of the business – profitability and reputation alike – is not threatened.


Nats on the rack for IT system failures
Thousands of travellers in UK airports were delayed last weekend due to a software problem caused by a faulty line of code at the London Air Traffic Control Centre at Swanwick in Hampshire. National Air Traffic Services (Nats), which controls 200,000 m2 of airspace, reportedly had a power system failure on an internal telephone switch controlling the change from nighttime ‘standby’ to daytime ‘live’ operation.

The partially privatised company (owned 49% by the UK Government, 41.9% by The Airline Group, 4% by Heathrow (formerly BAA) and 5% by Nats’ employees) has been running air traffic control for commercial UK flights since 2002.

The company handled over 2.1 million flights last year, carrying 220 million passengers in the UK.  Nats had problems with its IT in 2008. Additionally, the CAA criticised Nats in a report about a telephone failure which grounded 300 flights in 2013 – and flights in Southern England were delayed earlier in 2014 due to “technical problems”.

The problem software came from a package originally developed for the US air traffic control network. When that project collapsed, it was left to Nats to work through the outstanding development to make it serviceable, raising the price of Swanwick’s delivery by £150m from an original £475m budget.  Some of the blame is said to lie with an aged IT infrastructure.  Nats’ CEO explains, “There are 50 different systems at Swanwick and around four million lines of code”.  Nats’ decision last year to make a significant number of its most experienced, older IT engineers redundant – when these were the specialists most used to working with the older technology – will not have helped. Especially worrying is that this fault had not been seen before.  The latest incident follows accusations of a corporate failure to invest in new technology and exposes Nats to an increased risk of repeated outages in future – this despite CEO Richard Deakin’s promise that £575m would be invested over the next five years.

A CAA inquiry will now be launched to assess whether Nats has learned from its previous failures, with the risk of its licence being reviewed. It will be a bumpy ride for the UK’s Transport Secretary, Patrick McLoughlin, who will be providing a full account to Parliament of what went wrong.   Clearly, any organisation lumbered with legacy infrastructure – whether hardware, software or both – will see operational effectiveness and bottom-line profitability suffer if the Board does not take the bull by the horns and assess the best way to upgrade and secure its IT systems.


Microsoft and Skype attempt to eliminate the language barrier 
Back in May, during the Code Conference event, Microsoft demoed a breakthrough, upcoming feature for Skype which would let people who speak different languages talk to each other without a human translator. Users can either voice or video call each other with translations appearing in near real-time with options for spoken and sub-title like written translations.

This week Skype has opened up a preview of this new feature to Skype users who would like to give the in-development service a spin. Interested parties can go to the Skype website and register their interest. Currently the preview is limited to English and Spanish, with more languages promised soon. Initial reactions report that – although not perfect yet – the service does exactly as you would expect, allowing two people who can’t speak the same language to hold a conversation.

The business applications for an accurate auto-translator that can handle both voice and video calls are enormous. For example, a single-language Service Desk could communicate with customers worldwide without the traditional language barrier or costly multilingual employees. If successful, Skype Translator will shake up the translation business even more, calling into question the need for dedicated human translators and reducing the premium currently placed on knowing additional languages.

As the technology develops and matures, it is also likely we will see Skype Translator incorporated into Microsoft’s enterprise communication tool Lync, which was recently announced to be rebranded as Skype for Business – and if so, this adds further reasoning for the name change.

The future for Skype is looking very promising, and this announcement – more than any so far, including the cross-compatibility of Lync and Skype – makes Microsoft’s 2011 acquisition of Skype look more justified than anything the two companies have announced since. With Skype pre-installed in Windows and tightly integrated with the Microsoft account system, Skype now fits very nicely into the Microsoft ecosystem.

With Microsoft’s current ‘Mobile First, Cloud First’ mantra, we will likely see Skype Translator eventually integrated into the Skype app for smartphones and tablets – and with near real-time translations built into your phone, Microsoft may be the first to successfully smash the language barrier for all.


Financial services benefiting from outside help
The financial sector has seen major changes since the start of the credit crunch in 2008: changes in working practice, organisational restructures and cost-cutting exercises, with branch closures in banking and job cuts as people are replaced by technology as part of a digital strategy – all of which has seen sector employment decline by 16% since 2009.  Lloyds Bank is cutting 9,000 staff as part of its digital strategy, and Dutch bank ING has a similar project that will result in 1,700 staff losing their jobs.

Financial services organisations have increasingly turned towards using more third-party IT products, services and talent, as well as outsourcing their IT, which has boosted the number of workers in the IT sector.  According to an analysis by accountancy practice experts Nixon Williams, in 2009 there were 403,000 jobs in the IT sector compared to 459,000 in 2014 (12% up). In comparison, financial services jobs have fallen from 1.18m in 2009 to 986,000 today (16% down).

With the sector witnessing a major increase in automation software replacing manual roles, and rising public expectation of truly 24×365 customer service, enormous pressure is placed on financial institutions to manage huge data volumes in highly regulated, highly secure environments while resisting any downtime or DDoS attack.

Whilst the banking sector has traditionally had huge in-house IT teams, the costs, regulations and pace of technology evolution have whetted the industry’s appetite for using third parties with expert knowledge and robust solutions.  This sits alongside the disconcerting reality of often uncomfortably large legacy IT systems that continue to create vulnerabilities whilst they remain unchanged and merely patched up, versus a long-term strategy and commitment to invest in new, more flexible integrated IT infrastructures.

Some of the larger banks are starting to think laterally by turning to third parties for IT innovation to develop and implement non-core systems and apps, involving joint ventures with other institutions or even working with start-up firms.  Examples include HSBC Global Banking and Markets under CIO Sumeet Chabria, and Deutsche Bank, which has recently set up a joint innovation project with IBM, Microsoft and Indian IT services firm HCL Technologies to improve its digital credentials.

The motivation to sharpen the pencil looks clearer when recent studies, such as those from consumer research specialist Bizrate Insights, reveal that 72% of the public still trust banks with their details, more than they trust retailers.   However, there is no room for complacency over ‘trust’.  Potential competition for market share, should they move into banking, could come from established transactors PayPal and Amazon, who jockey for position in the trust rankings at 48.9% and 45.4% respectively.   Tech giants Apple and Google lag further behind at 21.4% and 12.9%.  Nonetheless, all of these, as well as Facebook, have systems that hold details about people and businesses and handle monetary transactions.   So the circling piranhas angling for additional income streams and greater global dominance may include some new names in the future.


Public Sector changing outsourcing habits in 2014
Market watcher ISG’s north Europe President, John Keppel, reports that the UK saw a major boost in outsourcing from the public sector in 2014. This included small and large contracts remaining in this country rather than being awarded offshore, with spending levels nearly doubling in comparison to the UK’s private sector.

This has involved some big-ticket outsourcing deals but also a lot of mid-market government business.  Annual Contract Values (ACVs) from IT outsourcing rose 16% across EMEA in 2014, with France’s ACV increasing by 250%, whilst the UK, with its more mature outsourcing market, saw a steady increase in line with cautious post-recessionary optimism.  This is seen as largely due to the complexity of services required in the UK public sector, as well as a lack of appetite for simply exploiting cheaper resources from offshore suppliers – the old adage ‘buy cheap, pay twice’ perhaps resonating more closely with those responsible for procurement. “The challenge for buyers will be to understand how they can get the most value from their outsourcing efforts, and to understand the real business impact,” concludes Keppel.

Director of Sales at Amicus ITS, Les Keen comments:  “With the increase in Cloud services, this presents ever greater opportunities in 2015 for IT MSPs.  Those who can demonstrate the breadth of their experience, deliver the highest levels of data security, be a true 24×365 IT provider AND respect their customer as a business partner not a number – should see the benefit of working in this sector in 2015”.


End of 2014
This is our last review of IT for the year and the blog staffers at Amicus ITS would like to take this opportunity to wish all our customers and everyone reading these posts, a very Happy Christmas and a peaceful New Year.   We will be back looking at the latest technology developments and worldwide IT business news once again in January.  See you in 2015.

The Week’s Technology News – 28th November 2014

Coldfinger not Goldfinger: smartphone biometrics are no panacea

Former GCHQ boss, Sir John Adye, has just given evidence about his concerns regarding the unsupervised use of biometrics on smartphones to an audience of British MPs in the Commons Science and Technology Committee.

Adoption of fingerprint technology has taken off, most notably with smartphone giant Apple’s iPhone 6, and users can now make payments and access services using a fingerprint. However, as the GCHQ security expert, who now runs his own biometrics company, commented:  “I don’t know what happens to my personal data when I use it on a smartphone… there’s no physical supervision of the system (unlike an ATM, which a bank oversees)”.  “You need to design security methods… which are going to be strong to protect the interests of the individual who is using the phone and the relying party at the other end… the bank or whoever it is, who is providing a service to them.”    Apple says it uses the most technologically advanced fingerprint security and puts security and privacy at the core of the “Apple Pay” system.   But Adye also wants more transparency in the way personal information is passed to third parties.  He does not believe users fully read the notices in tick-box procedures, which layers on complacency while, in the background, the criminal community gets ever more clever about seeking ways in.

Another biometrics engineer presenting to the Committee, Ben Fairhead, advised that there are various anti-spoofing methods to work out whether a finger is real, but acknowledged that spurious results can be thrown up if, for example, blood flow to the finger is low, causing verification to be rejected.  In a twist on the old tales of criminals smuggling a file into prison, we now have criminals adding iron filings to fake fingers to mirror the conductivity of human skin.  The Government will come under increasing pressure to demonstrate that it has weighed the growing approval of biometrics in border controls and public services against sufficient measures to safeguard against the risks and possible flaws.

Forget me not
With the ‘right to be forgotten’ now in place, the European Commission has finally published guidelines telling search providers how to handle individuals’ take-down requests (first discussed in our blog of 16 May 2014).

Most requests are in sync with what Google has already been doing, and the balance is successfully struck between an individual’s desire for privacy and the public’s right to know. One area that has created consternation in the EU, though, is Google’s tendency to warn both users and site operators when it takes a notice down. According to the Commission, this lacks legal basis and could contravene data protection laws.

This effect was famously experienced by US singer Barbra Streisand, who sought to have some online information taken down, but whose ensuing actions actually drew attention to the very issue she was trying to keep secret.

The Commission also wants a level playing field, so that removals apply to all web domains rather than only to country-specific ones (i.e. ‘.co.uk’ or ‘.fr’) while leaving uncensored results on a ‘.com’ page.   This comes at a time when Microsoft’s Bing search engine has just started reviewing ‘Forget.me’ requests, using the EU advice as a template – but it remains to be seen whether the guidelines can please both sides AND the regulators.