Climate change is one of the key societal challenges of our time, and the debate about it takes place across scientific disciplines and in the public realm, traversing platforms, sources, and fields of study. The analysis of such mediated debates has a strong tradition that started in communication science and has since been applied across a wide range of academic disciplines. So-called ‘content analysis’ provides a means to study (mass) media content in many shapes and formats in order to retrieve signs of the zeitgeist, such as cultural phenomena, the representation of certain groups, and the resonance of political viewpoints. In the era of big data and digital culture, in which websites and social media platforms produce massive amounts of content and network it through hyperlinks and social media buttons, content analysis needs to adapt to the many ways in which digital platforms and engines handle content. This book introduces Networked Content Analysis as a digital research approach that offers ways forward for students and researchers who want to work with digital methods and tools to study online content. Besides providing a thorough theoretical framework, the book demonstrates new tools and methods for research through case studies of the climate change debate on search engines, Twitter, and the encyclopedia project Wikipedia.
See Springer Link (available under Open Access).
With summaries in Dutch, Esperanto and English. DOI: 10.4233/uuid:d7132920-346e-47c6-b754-00dc5672b437 "The subject of this study is deformation analysis of the earth's surface (or part of it) and of spatial objects on, above or below it. Such analyses are needed in many domains of society. Geodetic deformation analysis uses various types of geodetic measurements to substantiate statements about changes in geometric positions. Professional practice, e.g. in the Netherlands, regularly applies methods for geodetic deformation analysis that have shortcomings, for example because they use substandard analysis models or defective testing methods. These shortcomings hamper communication about the results of deformation analyses with the various parties involved. To improve communication, solid analysis models and a common language have to be used, which requires standardisation. Operational demands for geodetic deformation analysis are the reason to formulate in this study seven characteristic elements that a solid analysis model needs to possess. Such a model can handle time series of several epochs. It analyses only size and form, not the position and orientation of the reference system, and datum points may be under the influence of deformation. The geodetic and physical models are combined in one adjustment model. Full use is made of available stochastic information. Statistical testing and computation of minimal detectable deformations are incorporated. Solution methods can handle rank-deficient matrices (both the model matrix and the cofactor matrix). And, finally, a search for the best hypothesis/model is implemented. Because a geodetic deformation analysis model with all seven elements does not exist, this study develops such a model. For effective standardisation, geodetic deformation analysis models need: practical key performance indicators; a clear procedure for using the model; and the possibility to graphically visualise the estimated deformations."
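The statistical-testing element mentioned above can be illustrated in miniature. The following sketch is emphatically not the study's seven-element adjustment model; it assumes the simplest possible case (a single point height observed at two epochs with known, uncorrelated standard deviations) and shows a standardised displacement test together with a Baarda-style minimal detectable deformation.

```python
# Minimal illustration only: a two-epoch significance test for one point
# height, not the full multi-epoch adjustment model developed in the study.
import math

def deformation_test(h1, h2, sigma1, sigma2, k=1.96):
    """Test whether the height change between two epochs is significant.

    h1, h2         : adjusted heights at epoch 1 and epoch 2 (m)
    sigma1, sigma2 : standard deviations of those heights (m)
    k              : critical value (1.96 for a 5% two-sided normal test)
    Returns (displacement, test statistic, significant?).
    """
    d = h2 - h1
    sigma_d = math.sqrt(sigma1**2 + sigma2**2)  # epochs assumed uncorrelated
    w = d / sigma_d                             # standardised test statistic
    return d, w, abs(w) > k

def minimal_detectable_deformation(sigma1, sigma2, sqrt_lambda0=2.8):
    """Smallest displacement the test detects with ~80% power at alpha = 5%.

    sqrt_lambda0 = z_{alpha/2} + z_{power} = 1.96 + 0.84 ~ 2.8 (Baarda-style).
    """
    return sqrt_lambda0 * math.sqrt(sigma1**2 + sigma2**2)

d, w, significant = deformation_test(10.000, 10.012, 0.003, 0.003)
mdd = minimal_detectable_deformation(0.003, 0.003)  # roughly 0.012 m here
```

A model with all seven elements would generalise this to networks of points over many epochs, combine geodetic and physical models in one adjustment, and handle rank-deficient matrices; the sketch only conveys the testing logic.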
Despite the benefits of the widespread deployment of diverse Internet-enabled devices such as IP cameras and smart home appliances, the so-called Internet of Things (IoT) has amplified the attack surface that is being leveraged by cyber criminals. While manufacturers and vendors keep deploying new products, infected devices number in the millions and are spreading at an alarming rate across consumer and business networks. The objective of this project is twofold: (i) to explain the causes behind these infections and the inherent insecurity of the IoT paradigm by exploring innovative data analytics as applied to raw cyber security data; and (ii) to promote effective remediation mechanisms that mitigate the threat of currently vulnerable and infected IoT devices. By performing large-scale passive and active measurements, this project will allow the characterization and attribution of compromised IoT devices. Understanding which types of devices are getting compromised and the reasons behind the attackers' intentions is essential to designing effective countermeasures. This project will build on the state of the art in information-theoretic data mining (e.g., using the minimum description length and maximum entropy principles), statistical pattern mining, and interactive data exploration and analytics to create a causal model that explains attackers' tactics and techniques. The project will research formal correlation methods, rooted in stochastic data assemblies, between IoT-relevant measurements and IoT malware binaries captured by an IoT-specific honeypot, to aid attribution and thus the remediation objective. Research outcomes of this project will benefit society by addressing important IoT security problems before manufacturers saturate the market with ostensibly useful and innovative gadgets that lack sufficient security features and are thus vulnerable to attacks and malware infestations, which can turn them into rogue agents.
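As a toy illustration of the minimum description length principle named above (and only of that; the project's mining operates on far richer cyber security data), one can compare two codes for a stream of honeypot-observed attacker commands: a parameter-free uniform code and an empirical-frequency code whose parameters must themselves be paid for. The command names below are hypothetical.

```python
# Two-part MDL sketch: total code length in bits = model cost + data cost.
# A skewed command stream should compress better under the empirical model.
import math
from collections import Counter

def description_length(events, use_empirical):
    """Return the two-part code length (bits) of `events` under one model.

    use_empirical=False : uniform code over the alphabet (no parameters)
    use_empirical=True  : empirical-frequency code, paying ~0.5*log2(n) bits
                          per free parameter (a common MDL approximation)
    """
    n = len(events)
    counts = Counter(events)
    alphabet = len(counts)
    if not use_empirical:
        model_cost = 0.0
        data_cost = n * math.log2(alphabet) if alphabet > 1 else 0.0
    else:
        model_cost = 0.5 * (alphabet - 1) * math.log2(n)
        data_cost = -sum(c * math.log2(c / n) for c in counts.values())
    return model_cost + data_cost

events = ["login"] * 90 + ["wget"] * 8 + ["chmod"] * 2   # hypothetical stream
uniform = description_length(events, use_empirical=False)
empirical = description_length(events, use_empirical=True)
# MDL prefers the model with the smaller total description length.
```

The same preference-by-compression idea, scaled up, is what lets MDL-style mining surface the patterns in attacker behaviour that are worth keeping as model structure.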
However, the insights gained will not be limited to attacker behavior and attribution, but will also extend to the remediation of infected devices. Based on the causal model and the output of the correlation analyses, this project will follow an innovative approach to understanding the remediation impact of malware notifications by conducting a longitudinal quasi-experimental analysis. The quasi-experimental analyses will examine remediation rates of infected/vulnerable IoT devices in order to make better inferences about the impact of the notification's characteristics and the infected user's reaction. The research will provide new perspectives, information, insights, and approaches to vulnerability and malware notifications that differ from the previous reliance on models calibrated with cross-sectional analysis. This project will enable more robust use of longitudinal estimates based on documented remediation change. Project results and methods will enhance the capacity of Internet intermediaries (e.g., ISPs and hosting providers) to better handle abuse/vulnerability reporting, which in turn will serve as a preemptive countermeasure. The data and methods will make it possible to investigate the behavior of infected individuals and firms at a microscopic scale and to reveal the causal relations among infections, human factors, and remediation.
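The quasi-experimental logic can be sketched with a difference-in-differences comparison. This is a simplified stand-in for the project's longitudinal estimators, using synthetic device records and assuming the usual parallel-trends condition between notified and non-notified groups.

```python
# Difference-in-differences sketch on synthetic data: estimate the change in
# remediation rate attributable to a notification, not the project's actual
# longitudinal model.

def remediation_rate(devices):
    """Fraction of devices marked as remediated."""
    return sum(d["remediated"] for d in devices) / len(devices)

def diff_in_diff(notified_pre, notified_post, control_pre, control_post):
    """Notification effect = (change for notified) - (change for controls),
    valid only under the parallel-trends assumption."""
    treated_change = remediation_rate(notified_post) - remediation_rate(notified_pre)
    control_change = remediation_rate(control_post) - remediation_rate(control_pre)
    return treated_change - control_change

# Synthetic example: 100 devices per group and period.
notified_pre  = [{"remediated": i < 10} for i in range(100)]  # 10% cleaned
notified_post = [{"remediated": i < 45} for i in range(100)]  # 45% cleaned
control_pre   = [{"remediated": i < 12} for i in range(100)]  # 12% cleaned
control_post  = [{"remediated": i < 20} for i in range(100)]  # 20% cleaned

effect = diff_in_diff(notified_pre, notified_post, control_pre, control_post)
```

Because the control group's drift is subtracted out, the estimate reflects remediation change beyond what infected devices would have shown without a notification, which is the core advantage over a cross-sectional comparison.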