An authority on information security, Professor Fred Piper, spoke on the need for balance in security at the University of Derby on March 16.
Prof Piper, of Royal Holloway, University of London, began with what he called ‘some unfortunate facts’. “There is no such thing as 100 per cent security. Why don’t we just admit it, and get on with doing the best we can?” As he neatly put it, in his non-technical talk to a student audience: “In theory there is no difference between theory and practice. In practice there is.”
He likened security to road safety. Roads are there to enable fast travel. Can we eliminate all accidents? The obvious answer is no; but how many accidents will we, as a society, accept? Of course, we try to minimise accidents. “In practice you just have to accept that accidents are going to happen if you want fast travel.” In theory, he added, you could make roads safe by going back to a man walking with a red flag in front of each motor vehicle; that, while safe, would make road travel impossibly slow. Hence the world has agreed standards for road safety and use. Why is there no such thing for the internet? Prof Piper asked. The main reason, he went on, is that road usage grew more slowly than the use of personal computers; society had time to think of, for example, driving licences. He asked whether it is possible to be honest and realistic about imperfect levels of security, without being accused of being negligent, irresponsible or uncaring.
Stressing that he was making an old point, Prof Piper spoke of security and insecurity as each having costs, now as in the early days of computers. “The objective is to minimise the sum of the two costs.” That is, you do not aim for total security, just as you do not want total insecurity. Costs and losses, he added, may differ for different players – a bank or other owner, an administrator, and a customer.
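His ‘sum of the two costs’ point can be pictured with a toy model. The sketch below is our illustration, not anything from the talk: the cost curves are invented, but they show why the cheapest point lies somewhere between no security and total security.

```python
# Toy model of Prof Piper's point: minimise (cost of security + cost of
# insecurity). The curves below are made up purely for illustration.

def total_cost(effort: float) -> float:
    cost_of_security = 10.0 * effort             # controls, staff, lost convenience
    cost_of_insecurity = 100.0 / (1.0 + effort)  # expected losses from breaches
    return cost_of_security + cost_of_insecurity

# Scan effort levels from 0 (total insecurity) to 10 (near-total security).
costs = [(total_cost(e / 10.0), e / 10.0) for e in range(101)]
best_cost, best_effort = min(costs)
print(f"lowest total cost {best_cost:.1f} at effort {best_effort:.1f}")
# The optimum is neither 0 nor 10: you aim for neither total security
# nor total insecurity, exactly as Prof Piper argued.
```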
The theory, the ideal, is that an inventor, entrepreneur or government suggests a new technology or measure; the security people identify weaknesses; the civil libertarians consider the human rights issues; and the product or service is launched. What determines the launch date? Not the security people declaring the product perfect – that may delay the project, or make it cost more, and customers may take another product. As soon as it is launched, it faces criminal and other attackers – often better resourced than the inventors, and without a deadline. What happens if attackers find a weakness? As Prof Piper explained, the pragmatic decision is to deliver the system and then fix the security later, if there are problems. He spoke also of the complexity of networks; it may no longer be clear to computer users what the ‘complete system’ is. One person’s vulnerability may become a threat to others – a computer virus, worm or Trojan exploiting their machine, for example. Prof Piper summed up the tensions: security versus convenience; security versus privacy; and security versus business opportunities. And almost all security measures have other consequences; he gave the example of public space CCTV, introduced for street safety, which has come to have other uses.
He went on to authentication. “Whatever security you have, it can only work if you can identify people and devices. Authentication is the cornerstone of information security. Authentication is fundamental.” He gave the CIA principle – that information security is about confidentiality, integrity and availability. Authentication may use one, two or three factors: something you know, such as a password; something you own, such as a plastic card or token; and something you are, such as a biometric. The process of authentication confirms that the person being recognised is the person who registered. Prof Piper added one possible catch: “If I impersonate you at the registration process, I have impersonated you forever.”
He gave a personal example of how the wish not to inconvenience the customer can override security rules. He recalled having to replace a credit card. When he rang an automated telephone service, it asked for the ‘memorable date’ he had set as a security question years before. When he guessed incorrectly, he was put through to a human operator. The security rule should have meant the credit card company did not issue a new card without further checking; in reality, the human operator simply asked him for a new ‘memorable date’.
While he said there had been an unreasonably high number of recent leaks of personal data, he did believe the government is capable of looking after personal data; but he suggested that personal data is not treated with the respect it deserves. He distinguished between kinds of personal data. Some is private, such as our health records. Some identifies us, such as passwords, bank account or National Insurance numbers, or a mother’s maiden name. A danger, he said, lies in using such static pieces of data about us as security passwords, since others may know and use them.
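For readers who want to see what ‘something you know’ plus ‘something you own’ looks like in code, here is a minimal sketch – our illustration, not anything Prof Piper showed – of a two-factor check: a password compared against a stored hash, plus a time-based one-time code of the kind a hardware token or phone app generates (the standard RFC 6238 TOTP scheme). The function names are ours, and a real system would use a salted, slow password hash such as PBKDF2.

```python
import hashlib
import hmac
import struct
import time

def totp(secret: bytes, period: int = 30, digits: int = 6) -> str:
    """RFC 6238 time-based one-time password, as shown on a token or phone app."""
    counter = int(time.time()) // period  # token and server derive the same counter
    digest = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F            # RFC 4226 dynamic truncation
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

def authenticate(password: str, stored_hash: bytes, code: str, secret: bytes) -> bool:
    """Two factors: something you know (password) and something you own (token).
    Illustrative only: real systems use salted, slow hashes (e.g. PBKDF2)."""
    knows = hmac.compare_digest(hashlib.sha256(password.encode()).digest(), stored_hash)
    owns = hmac.compare_digest(code, totp(secret))
    return knows and owns
```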
That said, he returned to the need for balance, and not to over-react after a security breach – for example, if a Ministry of Defence or other organisational laptop is stolen. If the reaction is to withdraw all laptops, people may use their own laptops from home instead, creating more insecurity. He spoke of the dilemmas of law enforcement. Law enforcers, he believed, have no interest in intruding into private lives; nor do they want to hinder e-commerce. They do want secure communications between themselves. That is, they want encryption – and they want to be able to break encryption. Should students be allowed to send encrypted emails? The response from the largely student audience was yes. But should terrorists be allowed to send encrypted emails for their plots? Here was the balance between the rights of the individual and the protection of society. As for encryption, he contrasted the 1970s and 1980s with today. “In the 1970s and 1980s, the use of encryption required dedicated encryption devices which meant that the few countries who produced such devices were able to control its use worldwide. However now that we all have personal computers anyone can implement encryption and that control has vanished.”
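That closing point can be seen on any personal computer today. As an illustration (assuming the widely used third-party Python `cryptography` package, not anything from the talk), a handful of lines now does what once needed a dedicated device:

```python
# pip install cryptography  (third-party package; assumed for this sketch)
from cryptography.fernet import Fernet

key = Fernet.generate_key()   # symmetric key; whoever holds it can decrypt
box = Fernet(key)
token = box.encrypt(b"an encrypted email body")
assert box.decrypt(token) == b"an encrypted email body"
```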
He went through the criteria for security policies. Employees must read the regulations, understand and believe in them, and adhere to them. He said: “It’s frightening how many policies exist where it’s impossible for employees to comply. An incomprehensible or impossible policy may force employees to violate policy.” He gave the example of people having to write down passwords if they have too many, or if the passwords are too long and complex. In other words, people fear being locked out of their computer system through not knowing the password more than they fear letting strangers in. If your concern is to prevent unauthorised access, stricter rules may simply mean more staff ringing help-desks, unable to get into their computers. He summed up: “Security policies must be understandable, reasonable, and you must be able to adhere to them.”
He spoke of the insider threat, and suggested that we should distinguish between the fraudulent and the accidental insider. Those insiders may be lazy or incompetent; here he gave the example of the loss of 25 million HMRC records on a disc lost in the mail. “So more and more you are finding security awareness campaigns, making people aware of the consequences of lost discs, losing information, and so on.” He went on to the ‘human factors’. Why do people break the rules? Evil intent, carelessness, unreasonable rules, or misunderstanding? “It’s amazing how often it’s either an unreasonable rule or a complicated rule and people break the rules because they don’t really understand them or because they are forced to break them.” Security professionals, he said, need to understand the technical and business issues, and understand the position of employees. “The challenge is to establish a security culture where everyone accepts that security is important, is their responsibility, and everyone is ‘on side’ with the security policy.” That may be idealistic, he admitted, but he pointed to steps in the right direction. Theory and practice are getting closer together; academia, industry and government are working more closely together. He gave the examples of the safe internet use website Get Safe Online; bodies such as the ISAF and the IISP – the Institute of Information Security Professionals (www.instisp.org); qualifications through ISC2, SANS, ISACA and the BCS (formerly the British Computer Society, now describing itself as the Chartered Institute for IT); and vendors’ certificates. “All of these are steps in the right direction.”
He closed by talking about cyber-security. Britain has a cyber security policy; but any purely national solution to the cyber world is of limited use, because cyberspace is a global arena. Hence international organisations such as ENISA and IMPACT, both attempts to introduce some policing to the internet. “Is it too late? Maybe. Is it desirable? It depends what you think the internet is for.”
Answering questions from the floor, he spoke first on biometrics. “If you really are concerned about identifying people, biometrics may be the answer. Now there is a lot of debate about what is a biometric,” he said, giving the examples of retina scans and even the lobe of the ear. Apart from fingerprints, he queried whether other biometrics have been around long enough for us to know whether the biometric we have at 30 (our face, for example) will have the same properties when we are 70. “There are all sorts of niggles, but the move towards biometrics is coming. They are coming, and the technology is improving … it has obvious possibilities.” Answering another question, he made the point that the cyber-conflict that saw Estonia closed down for days was a powerful warning.
Professional Security asked the question – we are all having to get more savvy about information security, but is the pace of technology such that we are falling ever further behind? “Of course!” Prof Piper replied. He gave the example of the merging of mobile phones and PDAs, which brings security problems of its own.
About Prof Fred Piper: He is director of the information security group at Royal Holloway, University of London, at Egham in Surrey. Visit www.rhul.ac.uk. Also twitter.com/isgnews.
Background
Prof Fred Piper was appointed Professor of Mathematics at the University of London in 1979. In 1985, he formed a company, Codes & Ciphers Ltd, which offers consultancy advice in all aspects of information security. He has acted as a consultant to over 80 companies including a number of financial institutions and major industrial companies in the UK, Europe, Asia, Australia, South Africa and the USA.
His consultancy work has been varied and has included algorithm design and analysis, work on EFTPOS and ATM networks, data systems, security audits, risk analysis and the formulation of security policies.
He has lectured worldwide on information security, both academically and commercially. He has published more than 100 papers and is joint author of:
* Cipher Systems (1982), one of the first books to be published on the subject of protection of communications
* Secure Speech Communications (1985)
* Digital Signatures – Security & Controls (1999)
* Cryptography: A Very Short Introduction (2002).
You can see previous years’ speakers – including Peter Yapp and Robert Schifreen – at http://www.derby.ac.uk/computing/disc/past-talks