The use of biometric monitoring allows researchers insight into how our central nervous systems process environmental data. As a result, we can determine precisely which stimuli cause arousal or draw our attention. This technology is widely used by commercial interests but is rarely applied to improving the public realm. Our authors hope to change this.
LINK
Airports have undergone a significant digital evolution over the past decades, enhancing efficiency, effectiveness, and user-friendliness through various technological advancements. Initially, airports deployed basic IT solutions as support tools, but with the increasing integration of digital systems, understanding the detailed digital ecosystem behind airports has become crucial. This research aims to classify technological maturity in airports, using the access control process as an example to demonstrate the benefits of the proposed taxonomy. The study highlights the current digital ecosystem and its future trends and challenges, emphasizing the importance of distinguishing between different levels of technological maturity. The role of biometric technology in security access control is examined, with particular attention to proper identification and classification. Future research could explore data collection, privacy, and cybersecurity impacts, particularly regarding biometric technologies in Smart Access Level 4.0. The transition from Smart Access Level 3.0 to 4.0 involves process automation and the introduction of AI, offering opportunities to increase efficiency and improve detection capabilities through advanced data analytics. The study underscores the need for global legislative frameworks to regulate and support these technological advancements.
DOCUMENT
Recent years have seen massive growth in ethical and legal frameworks to govern data science practices. Yet one of the core questions associated with such frameworks is the extent to which they are implemented in practice. A particularly interesting case in this context concerns public officials, for whom higher standards typically apply. We therefore seek to understand how ethical and legal frameworks influence the everyday data and algorithm practices of public sector data professionals. The following paper looks at two cases: public sector data professionals (1) at municipalities in the Netherlands and (2) at the Netherlands Police. We compare these two cases using an analytical research framework, developed in this article, for understanding everyday professional practices. We conclude that there is a wide gap between legal and ethical governance rules and everyday practice.
MULTIFILE
In this project we examine the laws and regulations surrounding data collection using sensors in assistive technology, along with the literature on people's concerns about this technology. We also look into the Smart Teddy device and how it operates. An analysis required by the General Data Protection Regulation (GDPR) [5] will reveal the privacy and security risks in this project and how to mitigate them. https://nl.linkedin.com/in/haniers
MULTIFILE
Lives of Data maps the historical and emergent dynamics of big data, computing, and society in India. Data infrastructures are now more global than ever before. In much of the world, new sociotechnical possibilities of big data and artificial intelligence are unfolding under the long shadows cast by infra/structural inequalities, colonialism, modernization, and national sovereignty. This book offers critical vantage points for looking at big data and its shadows, as they play out in uneven encounters of machinic and cultural relationalities of data in India’s socio-politically disparate and diverse contexts. Lives of Data emerged from research projects and workshops at the Sarai programme, Centre for the Study of Developing Societies. It brings together fifteen interdisciplinary scholars and practitioners to set up a collaborative research agenda on computational cultures. The essays offer wide-ranging analyses of media and techno-scientific trajectories of data analytics, disruptive formations of digital economy, and the grounded practices of data-driven governance in India. Encompassing history, anthropology, science and technology studies (STS), media studies, civic technology, data science, digital humanities, and journalism, the essays open up possibilities for a truly situated global and sociotechnically specific understanding of the many lives of data.
MULTIFILE
Through artistic interventions into the computational backbone of maternity services, the artists behind the Body Recovery Unit explore data production and its usages in healthcare governance. Taking their artwork The National Catalogue Of Savings Opportunities. Maternity, Volume 1: London (2017) as a case study, they explore how artists working with ‘live’ computational culture might draw from critical theory, Science and Technology Studies as well as feminist strategies within arts-led enquiry. This paper examines the mechanisms through which maternal bodies are rendered visible or invisible to managerial scrutiny, by exploring the interlocking elements of commissioning structures, nationwide information standards and databases in tandem with everyday maternity healthcare practices on the wards in the UK. The work provides a new context to understand how re-prioritisation of ‘natural’ and ‘normal’ births, breastfeeding, skin-to-skin contact, age of conception and other factors is gaining momentum in sync with cost-reduction initiatives, funding cuts and privatisation of healthcare services.
MULTIFILE
The growing availability of data offers plenty of opportunities for data-driven innovation of business models for SMEs such as interactive media companies. However, SMEs lack the knowledge and processes to translate data into attractive propositions and to design viable data-driven business models. In this paper we develop and evaluate a practical method for designing data-driven business models (DDBM) in the context of interactive media companies. The development follows a design science research approach. The main result is a step-by-step approach for designing DDBM, supported by pattern cards and game boards. The steps cover required data sources and data activities, actors and the value network, the revenue model, and implementation aspects. A preliminary evaluation shows that the method works as a discussion tool to uncover assumptions and make the assessments needed to create a substantiated data-driven business model.
MULTIFILE
Completeness of data is vital for decision making and forecasting in Building Management Systems (BMS), as missing data can bias decisions down the line. This study creates a guideline for imputing gaps in BMS datasets by comparing four methods: the K-Nearest Neighbour algorithm (KNN), a Recurrent Neural Network (RNN), Hot Deck (HD) and Last Observation Carried Forward (LOCF). The guideline gives the best method per gap size and scale of measurement. The four selected methods come from various backgrounds and are tested on a real BMS and meteorological dataset. The focus of this paper is not to impute every cell as accurately as possible but to impute trends back into the missing data. Performance is characterised by a set of criteria so that users can choose the imputation method best suited to their needs. The criteria are Variance Error (VE) and Root Mean Squared Error (RMSE); VE is given more weight because it evaluates the imputed trend better than RMSE does. Preliminary results indicated that the best K-values for KNN are 5 for the smallest gap and 100 for the larger gaps. Using a genetic algorithm, the best RNN architecture for the purpose of this paper was determined to use Gated Recurrent Units (GRU). The comparison was performed using a training dataset different from the imputation dataset. The results show no consistent link between differences in kurtosis or skewness and imputation performance. The experiment concluded that RNN is best for interval data and HD is best for both nominal and ratio data. No single method was best for all gap sizes, as the choice depends on the data to be imputed.
MULTIFILE
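To make the two simplest of the four imputation methods concrete, here is a minimal sketch, not the study's code: LOCF and a deliberately simplified KNN variant that averages the k observed values nearest in time (the function names, toy readings, and index-distance neighbour metric are all assumptions for illustration).

```python
def impute_locf(series):
    """Last Observation Carried Forward: fill each gap (None) with the
    most recent observed value."""
    out, last = [], None
    for v in series:
        if v is not None:
            last = v
        out.append(last)
    return out

def impute_knn(series, k=2):
    """Simplified KNN imputation: fill each gap with the mean of the k
    observed values closest in index (time) distance."""
    observed = [(i, v) for i, v in enumerate(series) if v is not None]
    out = list(series)
    for i, v in enumerate(series):
        if v is None:
            neighbours = sorted(observed, key=lambda t: abs(t[0] - i))[:k]
            out[i] = sum(val for _, val in neighbours) / len(neighbours)
    return out

# Hypothetical BMS temperature series with a two-sample gap
readings = [20.1, None, None, 20.7, 21.0]
print(impute_locf(readings))  # [20.1, 20.1, 20.1, 20.7, 21.0]
print(impute_knn(readings))
```

Note the trade-off the paper's criteria target: LOCF preserves the last level but flattens the trend across the gap, while the KNN average pulls imputed values toward the surrounding observations, which matters when VE (trend fidelity) is weighted over per-cell RMSE.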
Moving away from the strong body of critique of pervasive ‘bad data’ practices by both governments and private actors in the globalized digital economy, this book aims to paint an alternative, more optimistic but still pragmatic picture of the datafied future. The authors examine and propose ‘good data’ practices, values and principles from an interdisciplinary, international perspective. From ideas of data sovereignty and justice, to manifestos for change and calls for activism, this collection opens a multifaceted conversation on the kinds of futures we want to see, and presents concrete steps on how we can start realizing good data in practice.
MULTIFILE