NHS Confidentiality Guidance

It may be a source of some confusion that the NHS has five live guidance documents relating to the duty of confidentiality.

The original 2003 guidance document is the non-statutory “Confidentiality: NHS Code of Practice” issued by the Department of Health. It offered detailed guidance on:
• protecting confidential information;
• informing patients about uses of their personal information;
• offering patients appropriate choices about the uses of their personal information; and
• the circumstances in which confidential information may be used or disclosed.

Whilst of relevance to all who use confidential patient data, the primary audience was data protection officers and Caldicott Guardians. Its usefulness is limited by its age: it does not take into account the recommendations and implementation of the second Caldicott review, or the radical changes in NHS structure, management and technology since 2003.

In 2013 the HSCIC issued "A guide to confidentiality in health and social care", supported by a References Document, in response to the Caldicott 2 Information Governance Review. Whilst neither mentioning nor rescinding the 2003 Code, it covers practically the same ground, claims that it "… provides readers with a full picture of what they should do and why", and is written with the express intention that "readers do not have to consult multiple sources of guidance". It is statutory guidance issued under s265 of the Health and Social Care Act 2012, and health and social care bodies have a duty to have regard to it.

The audience is wider than that of the 2003 Code, and it is very much a practical guide for front line staff making confidentiality and disclosure decisions. The 2003 Code remains an important document and contains some useful items, such as flow charts, which are not in the 2013 Guide.

The third document is a formal “Code of Practice on Confidential Information” issued in December 2014 under s263 of the same Act. Again health and social care bodies (expressly including private contractors) must have regard to it in delivering services, but the audience and aims are different. This Code is aimed at the organisational level – those responsible for “setting and implementing organisational policy, within the organisations”.
It relates to “the collection, analysis, publication or other dissemination of confidential information concerning or connected with the provision of health services or of adult social care” rather than “the direct provision of care, related record keeping or documentation facilitating the handover of care from one care provider to another” which is covered in the 2013 guidance.
Finally, there is the NHS "Supplementary Guidance: Public Interest Disclosures", issued to support the 2003 Code and updated in 2010. On this special topic it remains the key guidance.


The “Legal Basis” for processing sensitive health data

There are several occasions when the legal basis for processing health data needs to be identified, particularly in relation to data sharing and processing arrangements and when carrying out a Privacy Impact Assessment. This discussion is limited to those occasions when explicit consent is not available and the processing is not part of the direct care pathway. It also does not go into objections to such processing.

Legal Requirements
Such processing must satisfy the applicable data protection legislation. It must also not breach the common law duty of confidence. These are separate and distinct issues. Unfortunately there has been a historic tendency to confuse or merge them. The common law duty effectively limits the ability to share to direct care (where consent may typically be implied from the consent to treatment).
So when talking about legal basis there are in fact two legal bases required:

(1) The legal basis for processing confidential data in compliance with the common law duty of confidence
(2) Compliance with the first data protection principle, in terms of identifying the required Schedule conditions

Current Position
In the absence of consent the only likely bases for dis-applying the common law duty of confidence are statutory authority under s251 of the National Health Service Act 2006 and the Health Service (Control of Patient Information) Regulations 2002, or a public interest justification. Public interest can itself be a basis for s251 authority, but s251 has the alternative basis "in the interests of improving patient care", which means that the full "public interest" test need not be satisfied. See the HSCIC Code of Practice on Confidential Information.

Secretary of State approval under Regulation 5 is the most common means. Such approvals cover the whole range of preventative medicine, medical diagnosis, medical research, the provision of care and treatment and the management of health and social care services (see s251(12)(a) National Health Service Act 2006). Authorisations are limited to what is necessary (s251(4) and Regulation 7).

In the absence of a formal s251 approval it is also possible to make a decision based on public interest. In such cases there must be a proper assessment of necessity and proportionality. This is covered in the NHS "Supplementary Guidance: Public Interest Disclosures" issued to support the 2003 Confidentiality Code.

Processing under s251 or a properly assessed public interest will ensure the processing is not unlawful, in terms of the first data protection principle, for breach of confidence. It does not however provide a condition for processing under either Schedule 2 or 3 of the Data Protection Act. Neither s251 nor public interest creates any functions – they simply enable existing functions.

The powers under the section 251 regulations only provide relief from the common law duty of confidence. Any activity taking place with the support of section 251 must still comply in full with the Data Protection Act.

Caldicott 2: To Share or Not to Share: The Information Governance Review at Page 69.

Where s251 approval or public interest exists it should not be difficult to find a Schedule 2 condition under the Data Protection Act. It will be one or more of the following:
(a) Condition 3: The processing is necessary for compliance with any legal obligation to which the data controller is subject, other than an obligation imposed by contract
(b) Condition 5(b) The processing is necessary— … for the exercise of any functions conferred on any person by or under any enactment
or
(c) Condition 6 The processing is necessary for the purposes of legitimate interests pursued by the data controller or by the third party or parties to whom the data are disclosed, except where the processing is unwarranted in any particular case by reason of prejudice to the rights and freedoms or legitimate interests of the data subject.

For practical purposes, however, Condition 6 may be ignored. If one of the other conditions does not apply then there will almost certainly be no case to proceed, as there is no Schedule 3 equivalent to Condition 6. Which of the other conditions applies may involve a careful consideration of whether the function is a duty or a power. If it is only a power, the necessity and proportionality tests will generally be harder to satisfy and Condition 3 is not available.

A Schedule 3 condition is also required. This will typically be either:
(a) Condition 7(1)(b) The processing is necessary— … for the exercise of any functions conferred on any person by or under an enactment or
(b) Condition 8 The processing is necessary for medical purposes – “medical purposes” includes the purposes of preventative medicine, medical diagnosis, medical research, the provision of care and treatment and the management of healthcare services

As noted above, where the processing relies on Condition 7(1)(b) it is important to note that s251 does not create a function, so on its own it cannot be the "enactment" which invokes Condition 7(1)(b). This was made clear in Caldicott 2 (see above).

The function in these cases will typically be one of the duties or powers in the National Health Service Act 2006 as amended by the Health and Social Care Act 2012. See also the NHS Commissioning Board guidance "The Functions of Clinical Commissioning Groups". Most such functions will also engage Condition 8. A rough sketch of this condition-selection reasoning follows.
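
By way of illustration only, the reasoning above can be sketched in code. This is a simplification under stated assumptions: the names, flags and labels are invented for the example, it presumes a common law basis (s251 approval or public interest) has already been established, and it is not legal advice. It simply mirrors the points that a duty engages Condition 3, a statutory function engages Conditions 5(b) and 7(1)(b), and a medical purpose engages Condition 8.

```python
from dataclasses import dataclass
from typing import List

# Hypothetical sketch of the Schedule 2 / Schedule 3 reasoning described above.
# Assumes the common law basis and the "necessity" test are assessed separately.

@dataclass
class ProcessingContext:
    statutory_function: bool   # a function conferred by or under an enactment
                               # (note: s251 itself does not create a function)
    function_is_duty: bool     # a duty, rather than a mere power
    medical_purpose: bool      # within the Condition 8 "medical purposes" definition

def schedule_2_conditions(ctx: ProcessingContext) -> List[str]:
    conditions = []
    if ctx.statutory_function and ctx.function_is_duty:
        conditions.append("Schedule 2, Condition 3 (legal obligation)")
    if ctx.statutory_function:
        conditions.append("Schedule 2, Condition 5(b) (statutory function)")
    return conditions

def schedule_3_conditions(ctx: ProcessingContext) -> List[str]:
    conditions = []
    if ctx.statutory_function:
        conditions.append("Schedule 3, Condition 7(1)(b) (statutory function)")
    if ctx.medical_purpose:
        conditions.append("Schedule 3, Condition 8 (medical purposes)")
    return conditions

# Example: processing supporting a statutory commissioning duty for a medical purpose
ctx = ProcessingContext(statutory_function=True, function_is_duty=True, medical_purpose=True)
print(schedule_2_conditions(ctx))
print(schedule_3_conditions(ctx))
```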

In practice, where "legal basis" is documented, it is simply recorded as "s251 Approval". This applies at all levels in the NHS. For example, there is CAG approval for CCG processing for invoice validation. The legal basis is typically referred to as "s251". However, within the CAG approval this is identified as "Medical Purposes – the management of health and social care services".

So here the first legal basis (lawfulness under the law of confidentiality) is s251 and the second legal basis (under data protection) is "Schedule 2 Condition 6 and Schedule 3 Condition 8". Although not identified by CAG, Condition 7(1)(b) will also apply in this and most cases. The statutory duty would be (amongst others) the duty under s3 NHS Act 2006 / s13 Health and Social Care Act 2012 to commission services.

In reality this lack of clarity does not cause any compliance problems as the legal requirements for recording and publicising “legal basis” are vague.
The impact of GDPR
The basic legal framework under GDPR will not change significantly. It will be supplemented by a new Data Protection Act. The common law of confidentiality and the first legal basis issue will remain.

The second legal basis is also effectively unchanged and will require:
(1) An article 6 condition. This will be either
a. Condition C: processing is necessary for compliance with a legal obligation to which the controller is subject e.g. the duty under s3 NHS Act 2006/ s13 Health and Social Care Act 2012
b. Condition E: processing is necessary for the performance of a task carried out in the public interest or in the exercise of official authority vested in the controller. This will include “the exercise of a function conferred on a person by an enactment”. See Data Protection Bill Clause 7(c)

(2) An article 9 condition. This will be either
a. Condition G: processing is necessary for reasons of substantial public interest, on the basis of Union or Member State law which shall be proportionate to the aim pursued, respect the essence of the right to data protection and provide for suitable and specific measures to safeguard the fundamental rights and the interests of the data subject. This includes "the exercise of a function conferred on a person by an enactment." See clause 9(3) and Schedule 1 Part 2 Paragraph 6(2)(c) of the Data Protection Bill
b. Condition H: processing is necessary for the purposes of preventive or occupational medicine, for the assessment of the working capacity of the employee, medical diagnosis, the provision of health or social care or treatment or the management of health or social care systems. See also clause 9(1) and Schedule 1 Part 1 of the Data Protection Bill

Whilst the underlying principles remain unchanged there will be significant risks in continuing the current relaxed attitude to recording “legal basis”.

Article 13(1)(c) requires privacy notices to identify the legal basis for processing. Consideration of the use of "legal basis" in the GDPR generally makes it clear this refers to the conditions in Articles 6 and 9. Accordingly, referring to s251 as the "legal basis" will not satisfy Article 13. Under Article 35 (data protection impact assessments) the legal basis under Article 6 or 9 will need to be identified as part of the mandatory assessment of necessity. Article 24 requires data controllers to demonstrate compliance with the GDPR; clearly this would include recording the "legal basis" for processing. Article 5(2) also specifically requires demonstration of compliance with lawfulness – which includes identifying the Article 6 legal basis. Recital 41 requires that any legal basis be "clear and precise".

It follows that processes and guidance should be adapted to ensure that in future the full legal basis, in GDPR terms, is identified (a sketch of such a record follows the list below). This will be particularly important for:
(1) Privacy Notices
(2) Data Protection Impact Assessments
(3) Data Sharing / Processing Arrangements
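
To make the documentation point concrete, here is a minimal, purely illustrative sketch of how the full legal basis might be recorded for a single processing activity, rather than the bare label "s251". The field names and the example values (which follow the invoice validation example above) are assumptions for illustration, not a prescribed format.

```python
from dataclasses import dataclass

# Hypothetical structure for recording the full legal basis of one activity,
# so that privacy notices, DPIAs and sharing agreements can draw on one record.

@dataclass
class LegalBasisRecord:
    activity: str
    confidentiality_basis: str   # first legal basis: common law duty of confidence
    article_6_condition: str     # second legal basis: GDPR Article 6
    article_9_condition: str     # second legal basis: GDPR Article 9
    underlying_function: str     # the enactment or duty the processing supports

invoice_validation = LegalBasisRecord(
    activity="CCG invoice validation",
    confidentiality_basis="s251 approval (CAG reference held on file)",
    article_6_condition="Article 6(1)(c) - legal obligation (duty to commission services)",
    article_9_condition="Article 9(2)(h) - management of health or social care systems",
    underlying_function="s3 NHS Act 2006 / s13 Health and Social Care Act 2012",
)
```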


How to send sensitive bulk emails

Chelsea and Westminster Hospital NHS Foundation Trust has been fined £180,000 after revealing the email addresses of more than 700 users of an HIV service. This was a classic case of putting all the email addresses of a large circulation group into the CC field so that all users saw everyone else’s email address.

Many of the email addresses would clearly identify individuals and, given the nature of the email, potentially revealed their HIV status to everyone else.

ICO essentially found two breaches of the seventh data protection principle:

  • Failing to use an account that could send a separate email to each service user.
  • Failing to provide staff with specific training on the importance of double checking that the group e-mail addresses were entered into the “bcc” field.

This possibly sends a confusing message. If staff had received "specific training on the importance of double checking that the group e-mail addresses were entered into the "bcc" field" but still made an error, would there still have been a fine? Is that sufficient as an appropriate technical and organisational measure against unauthorised disclosure of personal data?

I would strongly argue that it is not, when dealing with the most sensitive of personal data as in this case. Well trained staff still make mistakes. In my view employers in this situation should apply poka-yoke principles and not allow this to happen. Such mailing lists MUST be handled by proper management software, as the sketch below illustrates.
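
As an illustration of the point, here is a minimal sketch of the "send a separate email to each service user" approach. The server details, sender address and recipient list are invented for the example, and real mailing-list software would add authentication, error handling and suppression lists; the point is simply that the loop never exposes one recipient's address to another.

```python
import smtplib
from email.message import EmailMessage

# Illustrative sketch only: one message per recipient, so there is no CC/BCC
# field to get wrong. Host and addresses below are placeholders.
SMTP_HOST = "smtp.example.org"
SENDER = "clinic-newsletter@example.org"

def send_individually(recipients, subject, body):
    with smtplib.SMTP(SMTP_HOST) as server:
        for address in recipients:
            msg = EmailMessage()
            msg["From"] = SENDER
            msg["To"] = address   # a single address, never the whole list
            msg["Subject"] = subject
            msg.set_content(body)
            server.send_message(msg)

send_individually(["patient@example.com"], "Service update", "Details of the next clinic.")
```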

Applying that reasoning, it is arguable that the second finding should not have been cited as a causative breach, although it may perhaps have been an aggravating factor. The danger is that the ICO's findings may lead some to assume that either one of the two preventative actions is sufficient even for extremely sensitive data. Using BCC alone should not be.


Royal Free NHS Trust and Google UK

The murky world of Information Governance in the NHS has been further stirred by the story of the arrangements between the Royal Free Trust and Google UK.

The BBC and others have reported a “data-sharing agreement” between the two. Google will use data derived from access to 1.6 million patient records to develop an app known as Streams that will alert doctors when someone is at risk of developing acute kidney injury.

The agreement (or at least a part of it – there may be more) may be found here. It is headed "Information Sharing Agreement". It states quite clearly, however, that the Royal Free is at all times the data controller, and Google just a processor. If that is correct, it is a data processing agreement or contract, not an information sharing agreement. The two are quite distinct, though they may run in parallel.

Categorising it as a processing agreement may well account for much of the Royal Free's confidence (see the original BBC article and this follow-up) that the arrangement breaches neither the data protection principles nor the law of confidentiality.

However it is not entirely clear whether the agreement fully complies with principle 7 of the Data Protection Act which requires a “contract”. Does the document disclose any consideration? Are Google being paid? Who will own the rights to develop and exploit any app or product they develop? Perhaps Google is getting something wider out of this arrangement – e.g. a proof of technology platform? What, really, is the “purpose” as rather vaguely set out at the top of page 3? Whose purposes are these?

Putting those concerns to one side, there is a more fundamental problem. Saying Royal Free is the data controller and Google a mere processor simply does not make it so. These terms are legally defined in section 1 of the Act and illustrated in the ICO guidance.
As noted above it looks pretty clear to an outsider that Google has its own purposes in relation to the use of this data – if that is in any way true they are a data controller as well.

One should also ask what degree of independence Google has in determining how and in what manner the data is processed. Is the Royal Free really able, or indeed competent, to direct Google in its endeavours? If you look at the ICO guidance and examples, and assume ICO is competent in this field, it is almost impossible as an outsider to conclude that Google, with all its specialist skills and knowledge as a data analyst, is not acting, in part, as a data controller. Consider in particular the ICO examples at paragraphs 29-30 and 46-47.

So perhaps it IS a data sharing agreement after all. In which case, as others have commented, it is difficult to see how it complies with the Data Protection Act. Google will be processing sensitive personal data and no Schedule 3 condition in the Act appears to apply. They have no explicit consent and Schedule 3 Condition 8 is not apt.
Further, if in any way Google's processing as a controller goes beyond the direct care of those patients with acute kidney injury, as it seems it surely must, then there is a breach of medical confidentiality which cannot be overcome. Consent to share records other than for direct care cannot be implied, and the duty can only be set aside by a dispensation under s251 of the NHS Act 2006; a search of the current register does not suggest any approval has been given. Perhaps not surprising given the apparently erroneous casting (albeit misnamed!) of the arrangement as merely "data processing".


NHS Email Armageddon 2016

All intelligent people understand that email and the personal inbox is not a good place to store and manage corporate information.

Many organisations will have formal policies which forbid, frown on, or discourage such practices. A small subset of those may even provide an attractive alternative. Nevertheless most records managers have sleepless nights and Dali-esque nightmares involving bloated inboxes (10,000-plus messages kept "just in case") full of stuff (most of it personal data) which has long since passed its sell-by date and been officially deleted. Typically they will also have data protection and sometimes Freedom of Information responsibilities.

The NHS is of course enlightened and has an IG Toolkit which prevents this sort of thing. The NHSmail "Acceptable Use Policy" plays its part in compliance: "NHSMail … is not designed as a document management system. Documents or emails that are required for retention / compliance should be stored within your organisation's document management system in accordance with local Information Governance policies." [This paragraph is the April Fool section – the rest is true.]

In practice one of the few tools the diligent records manager has to limit the damage is the email quota. This eventually forces people (other than senior managers who have a secret tool to increase their quota) to sort out the worst and oldest of the mess. Sometimes (rarely but we can hope) at this point they seek advice.

So wake up, NHS records managers. NHSmail2 is nearly docked. The base quota is about to be increased from a typical 400MB (bad enough!) to 4GB for all users.


The appeal of Alzheimer's

Article originally posted in February 2016. Updated following confirmation from the Society that it had withdrawn the appeal after the terms were varied by mutual consent to extend the deadline to April 2017, and that the ICO had subsequently confirmed it was satisfied that the Society had met the terms of the notice.

On 5th January 2016 the Information Commissioner (“ICO”) served an Enforcement Notice (“EN”) on the Alzheimer’s Society under s40 of the Data Protection Act 1998 (“the Act”). The background is set out in the EN and need not be repeated here save to note that there is a clear history of concern on the part of the ICO as to whether the Society was complying with its duties under the Act.

The Society appealed against the EN, or at least some parts of it, and this garnered much criticism in some quarters (see e.g. here, including an adverse comment by me), to the effect that the Society should simply get on with what it should do to comply and not waste charity monies on unmerited appeals. The Society defended its actions essentially on the basis that aspects of the EN were unclear and that this was particularly unfair (it said) given its limited resources. Resource is a relevant issue – see Schedule 1 Part II Paragraph 9 of the Act. The basis of the appeal was set out here in a December 2016 update.

As a matter of law one must have some sympathy with their position. Failure to comply with an EN is a criminal offence (see s47). An EN may therefore effectively be equated with a court order. It is clear law in many areas that a court order must be precise and capable of being understood by the person to whom it is directed: that person must be clear and in no doubt as to the steps required for compliance. The issue here is whether an EN should be similarly clear given the potential sanction for breach.

At the time of the original post the full basis of the appeal had not been disclosed, but consider paragraph 9 of the EN, which perhaps most clearly raises the issues:

“Appropriate organisational and technical measures are taken against the unauthorised access by staff (including volunteers) to personal data”

That is effectively a statement of the requirement in data protection principle 7. But the EN does not say what ICO considers to be appropriate on the facts of the case. How will the Society know if and when it has met the relevant standard? Is this part of the Order sufficiently clear? Privacy and data protection advisers will know just how difficult it is in practice to set the relevant standard for a particular organisation, given that the standard is mutable having reference to the factors set out in paragraphs 9-12 of Schedule 1 Part II.

Similar issues arise on other parts of the EN but perhaps to a lesser extent. For example paragraph 5 refers to encryption “.. which meets the current standard or equivalent”. Which current standard? AES? DES? 128 bit? 256 bit? Quantum cryptography?

The appeal, had it been pursued, therefore raised a matter of general importance relating to ENs. This issue can arise in other areas. For example, a subject may claim he has not received all the personal data he is entitled to following a subject access request under s7 of the Act. The ICO on investigation agrees and orders the data controller to disclose "all the personal data to which the subject is entitled", or something similar, where one issue was whether certain information was or was not 'personal data'. Is that acceptable, or must the ICO go through the material and decide and specify exactly what he considers to be personal data, so that the data controller is in no doubt at all what he needs to disclose?

A decision in this case would have brought some welcome clarity.

In fairness I must mention the defence in s47(2) of the Act. A data controller is not guilty of an offence if he "exercised all due diligence to comply with the notice in question". There may be an issue as to whether, and to what extent, the availability of this defence moderates the requirement that an EN be precise. Where the EN is as vague as paragraph 9, I have my doubts as to how much it can do so. It may be possible to assess due diligence for requirement 5 (encryption standard), where there are some definitive published guidelines, but how do you test due diligence for a requirement which is not actually specified and has such a wide range of variables as set out in the Act?

The defence may actually make a non-specific EN practically worthless. Compliance could become a tick-box exercise in producing documentation and accepted risk assessments to set your own standards. Unless you set these at a level which no reasonable data controller could possibly accept, the due diligence defence would always be available.

In summary paragraph 9 of the EN does not appear, as required by s40, to specify the steps required for compliance with principle 7. It simply appears to specify and repeat that the data controller must comply with principle 7 – which is already a legal duty.

Finally it is interesting to note and compare the undertaking that the Society signed in February 2010. It does contain some similarly vague terms, e.g. "Physical security measures are adequate…", although the equivalent of paragraph 9 of the EN actually let the Society decide for itself what standard was adequate: "The data controller shall implement such other security measures as it deems appropriate…". There may equally be a lesson for anyone contemplating giving an undertaking – make sure there is no doubt what you need to do to comply with your own promises.


Do Children Have Rights (Part 2)

How far can we rely on Information Commissioner’s Office (ICO) Guidance?

I recently commented on the issue of schools internally publicising the performance of pupils. But what about publishing to the world? The concerns expressed below may be more serious than those I referred to last time as they directly involve the guidance issued by the Information Commissioner’s Office (“ICO”).

It's nearly exam season, and the ICO was minded last year (14 August) to publicise via its Twitter account guidance, updated as recently as February 2014, on publishing results:

#DPA does not stop exam results being published in local paper, but objections must be considered ico.org.uk/for_the_public…

The guidance, which can be found at the above link, is quite clear in its conclusion:

"The DPA does not stop the publishing of examination results by schools, e.g. in the local press. But schools have to act fairly when publishing results and must take seriously any concerns raised."

As is apparent ICO takes the view that the issue is essentially one of fairness and the rest of the guidance deals with this issue, as well as dealing with objections, in some detail. In effect the guidance is that providing schools make every effort to tell parents and pupils that they intend to publish, and there is no valid objection, the school will not be in breach of the Data Protection Act by so publishing.

Since an individual's results are clearly personal data, this is in accordance with the first data protection principle that "Personal data shall be processed fairly and lawfully…" For many this may seem quite in accord with common sense. I can (just) recall a time when I was taking school exams and it was quite routine for my local paper to publish the individual O and A Level results for all pupils and schools in the city where I lived. No-one batted an eyelid in those more innocent days.

But hang on a moment. Principle 1 also requires that “… at least one of the conditions in Schedule 2 is met.” The ICO guidance makes no mention of this at all. It only deals with fairness. A quick consideration of the available conditions shows that only conditions 1 or 6 could possibly be available.

1. The data subject has given his consent to the processing.

6. The processing is necessary for the purposes of legitimate interests pursued by the data controller or by the third party or parties to whom the data are disclosed, except where the processing is unwarranted in any particular case by reason of prejudice to the rights and freedoms or legitimate interests of the data subject.

It is not obvious which condition ICO thinks applies in this case, although it is likely that he is adopting condition 6, as the guidance does say that "In general, schools do not need people's consent to publish examination results".

If he considers condition 1 applies, he is falling into the fundamental error of confusing the giving of adequate privacy / fair processing notice with consent. But condition 1 surely does not apply here. Consent, as the ICO Guide to Data Protection tells us, is "…any freely given specific and informed indication of his wishes by which the data subject signifies his agreement to personal data relating to him being processed". The Guide goes on to warn that there must be some active communication of consent; this means that a school, however much notice it has given, cannot lawfully infer consent from a failure to respond or object.

If he considers condition 6 applies, it is strongly arguable that publishing fails to meet this condition on a number of grounds.

Where is the legitimate interest in the school publishing identifiable pupil information, as opposed to the alternative of publishing anonymous school performance results? Closely related, one can ask, "Where is the necessity?" Case law clearly establishes that when considering "necessity" one must consider Article 8 of the European Convention on Human Rights, and that in doing so "necessity" requires that there be a pressing social need to infringe the individual's basic right to privacy.[1] One can accept that there may be a pressing social need to have clear information about the overall performance of a school, but that clearly does not extend to the individual's data. In concentrating on fairness ICO appears to totally ignore the other requirements of condition 6.

Also in November 2012 ICO published an Anonymisation Code of Practice. This makes the succinct point: “However, where the use of personal data is not necessary, then the objective should generally be to use anonymised data instead.” The Guidance on exam results does not seem to accord with his own principle expressed here.

This may seem a complex analysis for processing which does not, it is conceded, seem particularly unfair. But there are two important points to note.

Firstly it is precisely these areas of the interface between fairness, anonymisation and consent which in practice cause the most difficulty for data protection officers and advisers. Whilst there is an element of risk management involved, such issues need to be solved by a rigorous analysis and application of the law. To omit or ignore such analysis when the opposite conclusion feels right is a recipe for potential disaster. In the present case it may not cause much damage. On another occasion and in different circumstances the same approach may leave a data controller open to enforcement, damages and fines.

Secondly, a hard pressed data controller needs to be able to rely on sound guidance from the ICO as regulator. If guidance is as incomplete in its analysis as this appears to be then a struggling data controller has a right to be a little aggrieved.

[1] In the case of Corporate Officer of the House of Commons v Information Commissioner [2008] EWHC 1084 (Admin), the High Court said at paragraph 43 that 'necessary' in Schedule 2 condition 6 meant that there must be a "pressing social need" for disclosure.


Do schoolchildren have rights?

My eye was caught recently by this article on the BBC News channel. It seems some, no doubt well-meaning,* Head thought it a good idea to create what was in effect a league table of pupils’ ability by displaying their names and photos on a classroom wall along with how they ranked based on academic performance. It seems from the article that without doing this they felt they would be unable to provide individual support to allow pupils to improve. Who am I to argue with that?

Whilst it seems a large number of parents and pupils, possibly I suspect those in the relegation zones, thought this was inappropriate and possibly counter-productive, no-one seems to have queried** whether the school had a right to do this. This intrigued me because it had parallels with a blog post I wrote a couple of years ago but seem to have forgotten to publish. More of that later.

Clearly the school is processing personal data. There is no suggestion that this has been done by consent of the pupils. I will skip over whether there is a breach of confidentiality involved as that is rather a difficult topic. But data protection law is rather easier.

Was what the school did fair? A significant number of those involved appear to think not, which is usually a good indicator. A pity perhaps that the school was not in Clapham (it's in remotest Northumberland); then we could cross-check to see how many of the objectors went to school by bus, to clinch the question. If not fair, Data Protection Principle 1 is breached.

What condition in Schedule 2 of the Data Protection Act can the school pray in aid of its actions in the absence of consent? I can ignore the unknown factor of any attempt to imply consent from the school's prospectus or terms, as that would not be freely given informed consent. The only possibility is the well-worn condition 6. Let us set it out – my emphasis:

The processing is necessary for the purposes of legitimate interests pursued by the data controller or by the third party or parties to whom the data are disclosed, except where the processing is unwarranted in any particular case by reason of prejudice to the rights and freedoms or legitimate interests of the data subject. [note singular case]

The school clearly has a legitimate interest in raising standards. For argument's sake let us give them the benefit of the doubt that they have sufficient evidence that "transparent communication" works to do that at a school level – although personally it seems a bit flaky and I would like to know more about how and when (timing seems to be relevant) it was done in those schools where it was said to have been a successful strategy.

What is far more doubtful is whether it is “necessary” to do this. The stated aim is to ensure pupils are “aware of their relative-ability levels prior to entering examinations”. If that is true I have little doubt it could be achieved in another way. The necessity test fails and this would be a data protection breach.

I strongly suspect, however, that the stated aim is a euphemism for the true aim, which is hard to state in a PC world. I strongly suspect that children are pretty well aware of their relative standing (they certainly were in my class!). No, the point here is that awareness is not enough. We need to name and shame and embarrass them into pulling their socks up. If that is the true aim, perhaps necessity falls into place – although fairness still seems iffy. And if it works for Charles and Charlotte, the means may be justified by the end and the processing is possibly not unwarranted. But will it work for everyone? Probably not, and that causes a fundamental difficulty. Condition 6 (and probably many other parts of the law) requires the school to consider the position of each data subject affected. If this exercise has the opposite effect of demotivating or causing other difficulties for any pupils then, considering those pupils, surely the processing is unwarranted. Or perhaps we are in Spock's realm of the needs of the many outweighing the needs of the one. I'm not convinced data protection is meant to work that way, particularly where children are concerned.

In summary I think there is a strong argument that what this school did was in breach of principle 1 – unfair and with no Schedule 2 condition satisfied. Anyone care to tell the distressed kids they may have a claim for damages?

My other post, which must await another day as I have gone on far too long, was about exam results.

*I can tell they were well-meaning because they use all the right words like "innovative strategies", "group-learning classroom***" and "transparent communication****".

** Apologies if this has been picked up on the DP twitter-sphere which I am currently gracing with my absence.

*** Better for kids than solitary-learning cells.

**** (running out of star space soon) Presumably they previously communicated in code without giving out the code-book.


Volunteers and Data Protection

The Information Commissioner recently published a report following a series of visits to Victims' Services Alliance organisations. These are typically charities, but the report is also aimed at non-charitable voluntary organisations.

One issue touched on is the question of identifying who is a data controller and who is a data processor. ICO found that service agreements did not always refer to data protection, information security or any records management procedures. It was sometimes unclear, in terms of the DPA, who was the data controller or data processor and what would happen to the personal data they are holding should the relationship with the organisation break down, or the agreement be terminated. ICO recommended:

Organisations should refer to the ICO’s data controllers and data processors guidance which explains the difference between both categories and the implications for the organisations concerned. Once the relationship has been determined, this should be formally documented along with the relevant roles and responsibilities. Create a counsellor’s agreement which covers, as a minimum, the security provisions that are expected for any documents containing personal data about a client, and the response times that are expected to be met for any requests for information from management.

Unfortunately, as practitioners well know, deciding what is what in this area, is often very difficult, and involves interpreting the definitions of data controller and processor in section 1 of the DPA. This is likely to become an increasingly important issue as some public services are ‘contracted out’ to charities and other voluntary organisations.

The controller is the person "who (either alone or jointly or in common with other persons) determines the purposes for which and the manner in which any personal data are, or are to be, processed", and the processor is someone, other than an employee of the data controller, "who processes the data on behalf of the data controller". To make matters worse, in some situations a person may be both controller and processor.

There are some clear pitfalls to avoid, and these will be considered in the light of a fairly typical situation: a local authority handing over the running of a local library to a voluntary community organisation. Under the Public Libraries and Museums Act 1964, councils have a duty to provide a 'comprehensive and efficient' public library service. Running a library service inevitably involves handling personal data of customers. Whilst this may well not technically be 'sensitive personal data', it can have sensitive and confidential implications: the type of book borrowed, websites visited on library computers, etc.

As this is a statutory function, the Council is often likely to remain data controller in relation to the customer data. It remains the Council’s purpose, and in most cases it will still be run to Council processes. The voluntary organisation (“VO”) would then be a data processor, with all the implications for the Council & VO under principle 7 including:

  • appropriate contractual terms dealing with DP responsibilities
  • due diligence by Council in appointing and monitoring the running of the service
  • VO responsible for taking appropriate technical and organisational measures against unauthorised or unlawful processing of personal data and against accidental loss or destruction of, or damage to, the personal data, including vetting and training 'staff' – although the agreement could of course pass this back to the Council in reality

In such an arrangement, where the VO is purely a data processor, it is probably not necessary (at least in DP compliance terms) that the Council obtains customer consent for the new arrangement, although the requirement of fairness (and political reality) suggests that they should be told.

However it is possible to imagine other ways of farming out this service where the VO actually has some leeway to provide new and innovative services outside the core duties imposed on the Council, which it would determine and manage for itself. In such a case the VO could become a data controller in relation to using personal data for these purposes. In this case there would almost certainly need to be a renewal of customer consent which would have to be managed carefully before the VO could use the personal data.

Great care must also be taken over the status of the 'staff' themselves. There is a difference, not often recognised, between pure volunteers and 'voluntary workers'. Because of the definition in the DPA, a pure volunteer (i.e. someone who does not have any form of contract of employment or contract to perform work or provide services, is under no obligation to perform work or carry out instructions, and can come and go as they please with no expectation of any reward) would have to be regarded as a data processor in his or her own right, as they would not be an 'employee' of the VO. It would be almost impossible for either Council or VO to comply with its data protection obligations if such a volunteer had access to personal data.

The VO should therefore engage "voluntary workers" under a formal contract within the meaning of s44 of the National Minimum Wage Act 1998. This would be sufficient to make them an 'employee' in the terms of the DPA definition, and the contracts should have appropriate confidentiality clauses. The VO would then bear vicarious liability for the acts of the volunteers.


Don’t Expect ICO to argue your case

When applying to the Information Commissioner under s50 FOIA to overturn a refusal by a public authority it is important to marshal all your arguments and understand what is available to you.

This was illustrated in a decision in September, case number FS50546586, where the requester sought information from the Ministry of Justice relating to the Court Proceedings Database, including the names of offenders found guilty of offences under the Housing Act 2004 held on that database. Disclosure was refused by the MoJ on the basis of the personal data exemption.

Not surprisingly, ICO overturned this in relation to those convicted who were companies, as the information was not personal data, and the applicant was quite pleased with his partial victory. See http://bit.ly/1wsiE0c

But what of the guilty who were private individuals? ICO correctly found that in their case the information requested was sensitive personal data. Such data cannot be released if to do so would breach the data protection principles, and here we are concerned with Principle 1: "Personal data shall be processed fairly and lawfully and, in particular, shall not be processed unless— (a) at least one of the conditions in Schedule 2 is met, and (b) in the case of sensitive personal data, at least one of the conditions in Schedule 3 is also met."

The ICO carefully considered fairness in relation to the individuals and concluded that disclosure would indeed be fair. He then looked for a Schedule 3 condition and considered the only possible candidates: condition 1 (explicit consent) and condition 5 (information made public as a result of steps deliberately taken by the data subject). He was satisfied that neither of these applied, and that was the end of the matter.

The Commissioner must uphold the MoJ’s application of the exemption at section 40(2) in respect of the sensitive personal data in this case. He does so not on the basis that disclosure would be unfair but on the basis that there is no applicable Schedule 3 condition. The personal data is therefore exempt from disclosure.

But surely he should have considered Para 3 of the Schedule to The Data Protection (Processing of Sensitive Personal Data) Order 2000, of which I suspect the applicant was blissfully unaware.

The disclosure of personal data— (a) is in the substantial public interest; (b) is in connection with— (i) the commission by any person of any unlawful act … (c) is for the special purposes as defined in section 3 of the Act (i.e. journalism); and (d) is made with a view to the publication of those data by any person and the data controller reasonably believes that such publication would be in the public interest.

Given the basis of ICO's decision on fairness, it must have been very strongly arguable that this could properly be applied – the applicant was a journalist after all. This has been used successfully in the past; see e.g. the discussion of the Nick Griffin case on the excellent Panopticon blog http://bit.ly/1wsgayP

And if a Schedule 3 condition (in the extended sense) could be found, it is not hard to find a suitable condition in Schedule 2.
