Privacy By Design - Interdisciplinary Privacy Course
This interdisciplinary course is part of the thematic training of the Leuven Arenberg Doctoral School Training Programme. The course is mainly aimed at Ph.D. students from all disciplines (from KU Leuven or from other universities), but is also open to undergraduate students, post-docs, people working in industry, and anyone else interested in the topic.
In a series of lectures, the course will provide an overview of various aspects of privacy from the technical, legal, and social science perspectives. This year’s edition of the course will have a special focus on privacy by design, web search services, and behavioral advertising.
In addition to the lectures, this year’s course will feature interactive exercise sessions in which participants work in groups. In these sessions, participants will apply what they learn in the lectures to a practical case study (a web search application). They will be asked to identify the stakeholders and their requirements, define the functionality of the system, select the technologies to be implemented in the design, and discuss the legal and societal aspects of the system. Participation in all sessions is required to obtain the certificate of attendance.
- Wednesday, June 27, from 9:15 to 17:00
- Thursday, June 28, from 9:00 to 17:15
- Friday, June 29, from 9:15 to 16:30
Lecture room: MTC1 02.07
Interactive exercise sessions rooms: MTC1 00.07 and MTC1 00.16
3000 Leuven (Belgium)
- Claudia Diaz (KU Leuven ESAT/COSIC)
- Seda Gürses (KU Leuven ESAT/COSIC)
- Eleni Kosta (KU Leuven Law/ICRI)
- Bettina Berendt (KU Leuven CS/DTAI)
- Jo Pierson (VUB IBBT-SMIT)
- Invited speaker(s) - TBA
- The course is free of charge, but attendees are required to register by sending an email to email@example.com
- The course will provide coffee breaks for the participants. Lunches are not provided. A number of restaurants are in the vicinity of the course venue.
- The registration deadline is: Tuesday, June 20, 2012
Wed June 27
09:15 - 09:30 Welcome coffee
09:30 - 10:15 Lecture 1: Introduction (Claudia Diaz)
10:15 - 11:15 Lecture 2: Addressing Surveillance and Privacy during Requirements Engineering:
The challenge of search and behavioral advertising (Seda Gürses)
11:15 - 11:40 Coffee break
11:40 - 12:30 Explanation of the practical exercise (Seda Gürses)
12:30 - 14:00 Lunch break
14:00 - 15:15 Exercise session 1
15:15 - 15:40 Coffee break
15:40 - 17:00 Exercise session 2
Thu June 28
09:00 - 09:15 Welcome coffee
09:15 - 10:15 Lecture 3: Web mining and privacy: threats, opportunities, and design issues (Bettina Berendt)
10:15 - 11:15 Lecture 4: Social perspective on (dis)empowerment of users in an internet environment (Jo Pierson)
11:15 - 11:35 Coffee break
11:35 - 12:35 Lecture 5: Technologies for private search (Claudia Diaz)
12:35 - 14:00 Lunch break
14:00 - 15:00 Lecture 6: (Re)introducing privacy by design: the realm of search engines (Eleni Kosta)
15:15 - 15:35 Coffee break
15:35 - 17:15 Exercise session 3
Fri June 29
09:15 - 09:30 Welcome coffee
09:30 - 11:00 Invited talk (tba)
11:00 - 11:20 Coffee break
11:20 - 12:30 Exercise session 4
12:30 - 14:00 Lunch break
14:00 - 15:00 Exercise session 5: preparation of presentations
15:00 - 15:20 Coffee break
15:20 - 16:30 Presentations of results of the exercise and discussion
Lecture 1: Introduction (by Claudia Diaz)
This lecture will motivate the need for privacy protection, introduce the arguments in the privacy debate, and review the main approaches to privacy. Some of the questions that we will address in this talk include: Why is privacy important? Why is it so complex? What are the different meanings of “privacy”? How does “privacy” translate to technical properties and how do these relate to classical security properties?
Lecture 2: Addressing Surveillance and Privacy during Requirements Engineering: The challenge of search and behavioral advertising (by Seda Gürses)
Privacy is a debated notion with various definitions that are also often vague. While this increases the resilience of the privacy concept in social and legal contexts, it poses a considerable challenge to defining privacy problems and the appropriate solutions to address them in a system-to-be. Surveillance can be summed up as “any collection and processing of personal data, whether identifiable or not, for the purposes of influencing or managing those whose data have been garnered” (Lyon, 2001). One of the main concerns with any type of surveillance is social sorting, a form of classifying people based on surveillance data that may have real effects on people’s life chances. In the context of web-based search, given its current integration with targeted and behavioral advertisement, different parties raise concerns with respect to privacy and surveillance. From an engineering perspective, this raises the question of whether and how these matters can be addressed when engineering information systems. Ideally, when engineering systems, the stakeholders step through a process of reconciling the relevant privacy and surveillance definitions and the (technical) privacy solutions in the given social context. We will explore methods to define and elicit concerns based on different privacy and surveillance notions; summarize the desired steps of a multilateral requirements analysis approach; and discuss how these methods can be applied in the context of web-based search and behavioral advertising.
Lyon, D. (2001). Surveillance society: Monitoring everyday life. Buckingham, UK: Open University Press.
Lecture 3: Web mining and privacy: threats, opportunities, and design issues (by Bettina Berendt)
Web mining is the application of data mining techniques to Web data such as queries and other records of usage, social-network profiles and friend links, or news, blogs and tweets. Data mining means finding new knowledge that was previously only implicit in data. Web mining thus operates on large amounts of personal data that keep growing in volume and interrelatedness, and it leads to inferences about individuals and groups that may be beneficial for some but unwanted, or even pernicious, for others.
In this lecture, I will first give an overview of mining techniques and typical uses such as profiling. I will then describe methods that have been proposed for protecting personal data from unwanted inferences (privacy-preserving data mining) or for reducing the risks of releasing these data (privacy-preserving data publishing). I will investigate the roles in the mining process (who is doing the mining on whose data of what sorts) and identify threats and opportunities in different settings that range from business intelligence to feedback and awareness tools for user empowerment. I will conclude with thoughts on what “privacy by design” may mean in the context of Web mining.
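One well-known criterion from the privacy-preserving data publishing literature mentioned above is k-anonymity: a released table should contain, for every combination of quasi-identifier values (attributes like age bracket or ZIP code that could link records to individuals), at least k matching records. As a minimal illustrative sketch (not taken from the lecture; the toy table and attribute names are invented):

```python
from collections import Counter

def is_k_anonymous(records, quasi_identifiers, k):
    """Check whether every combination of quasi-identifier values
    appears at least k times in the released table."""
    combos = Counter(tuple(r[a] for a in quasi_identifiers) for r in records)
    return all(count >= k for count in combos.values())

# Toy table: age bracket and generalized ZIP are quasi-identifiers;
# the search query is the sensitive attribute being released.
table = [
    {"age": "20-30", "zip": "30**", "query": "flu symptoms"},
    {"age": "20-30", "zip": "30**", "query": "job listings"},
    {"age": "40-50", "zip": "30**", "query": "loan rates"},
]

print(is_k_anonymous(table, ["age", "zip"], 2))  # False: the 40-50 group has only one record
```

A publisher failing this check would typically generalize the quasi-identifiers further (e.g., widen the age brackets) until every group reaches size k.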
Lecture 4: Social perspective on (dis)empowerment of users in an internet environment (by Jo Pierson)
In a society where people increasingly rely on search engines and social media for communication and information sharing, it is vital to investigate these new forms of mediated communication from the social perspective of users/citizens/consumers. However, in this transitional digital media ecosystem we observe how people can become simultaneously empowered and disempowered, particularly on the levels of identity, privacy and surveillance. How this works out depends on the interrelationship between how internet systems are designed (i.e. what they enable) and what people within their social context do with these systems (i.e. what they are able to do). In this way we notice, for example, that users of search engines and social media are foremost framed as consumers, and ‘relevance’ is foremost posited as ‘commercial relevance’. Questions are therefore: How can governance and power manifest themselves through the algorithm? To what extent and how are the social practices of citizens and communities following, opposing and/or negotiating the ‘governance’ of internet systems? In what ways is the social self increasingly being commodified, with personal data becoming the new currency? In what way can a socio-technological perspective offer solutions?
Lecture 5: Technologies for private search (by Claudia Diaz)
Search queries closely reflect the issues we are interested in. This raises privacy concerns, as potentially sensitive information can be inferred from these queries, such as income level, health issues, or political beliefs. In this talk we will review different technologies for implementing private search services. These include cryptographic techniques such as private information retrieval, as well as obfuscation-based private web search based on automatically generating fake queries.
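The obfuscation-based approach mentioned above can be sketched very simply: hide each genuine query among automatically generated decoys so the profile the search provider builds is diluted. The snippet below is a minimal illustrative sketch, not any specific tool from the lecture; the decoy list and function names are invented (real tools in this spirit, such as TrackMeNot, draw decoys from live sources like RSS feeds):

```python
import random

# Hypothetical decoy vocabulary; a real obfuscation tool would draw
# from a much larger, regularly refreshed source of plausible queries.
DECOY_TOPICS = ["weather forecast", "football scores", "recipe ideas",
                "movie showtimes", "news headlines"]

def obfuscated_query_stream(real_query, n_decoys=4, rng=random):
    """Return the real query hidden among n_decoys fake ones,
    in random order, diluting the server-side interest profile."""
    decoys = rng.sample(DECOY_TOPICS, n_decoys)
    stream = decoys + [real_query]
    rng.shuffle(stream)
    return stream

stream = obfuscated_query_stream("diabetes treatment")
print(len(stream))  # 5 queries are sent, only one of which is genuine
```

The privacy this buys depends on how indistinguishable the decoys are from real queries; naive decoy generation can often be filtered out statistically, which is one reason the cryptographic techniques covered in the lecture offer stronger guarantees.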
Lecture 6: (Re)introducing privacy by design: the realm of search engines (by Eleni Kosta)
Building legally compliant systems that process personal information is turning into a nightmare for online businesses. The quest to find a balance between the privacy of users on the one hand, and the maximization of the profit of online businesses, usually derived from the processing of user information, on the other, proves to be a difficult task. This lecture will present the initiatives of the European Commission, in the frame of the reform of the European Data Protection Directive, to achieve such a balance. The case of search engines, which collect and process vast amounts of user information, will be used as an example.
*** Interactive exercise sessions
Exercise session 1
In this session the students will identify the stakeholders and describe their interests and stakes in the system, including their incentives and potential conflicts between their interests.
Exercise session 2
In this session the students will specify the functionality, domain, and trust assumptions of the system. They will also construct an initial model of the information that is necessary to fulfill the functionality of the system.
Exercise session 3
In this session the participants will identify the legal frameworks that apply, describe the legal roles and responsibilities of the stakeholders and their data protection requirements, and discuss the societal implications of the system linked to power relations between different stakeholders. They will also conduct an analysis of the privacy concerns of the stakeholders and the service integrity guarantees (i.e., threat and security analysis).
Exercise session 4
In this session the participants will further refine the definition of privacy goals and provide suggestions for privacy technologies that could be used in the system. The participants are asked to apply some of the things they learned in the lectures to the system they are developing. The specific choices of technical solutions to be used in the system will require re-thinking of the applicability of legal frameworks, the concrete functionality and the information model.
Exercise session 5
In this session the participants will consolidate their conclusions and prepare the presentation of their results to the rest of the course participants, which will take place in the last session of the course.