In this paper, we experimentally compare orthogonal frequency-division multiplexing (OFDM) and on-off keying (OOK) modulation in the context of the IEEE 802.15.13-2023 standard, at bandwidths up to 50 MHz, across a Li-Fi link with distances up to 5 m and lateral offsets up to 51°. Error vector magnitude (EVM) and bit error rate (BER) evaluations confirm that the high peak-to-average power ratio (PAPR) of OFDM limits the achievable transmission distance, although OFDM offers higher data rates thanks to its higher spectral efficiency. Owing to its lower PAPR, the OOK-based Pulsed Modulation PHY (PM-PHY) achieves a significantly longer link range. As the structure of the PM-PHY is based on OFDM symbols, the two solutions may also be combined to open up a wider range of use cases for optical wireless communications.
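To make the PAPR argument concrete, the following minimal sketch (illustrative only, not the paper's measurement setup; subcarrier count and modulation order are arbitrary assumptions) compares the PAPR of a real-valued OFDM symbol with that of an OOK sequence:

```python
import numpy as np

rng = np.random.default_rng(0)

def papr_db(x):
    """Peak-to-average power ratio of a signal, in dB."""
    p = np.abs(x) ** 2
    return 10 * np.log10(p.max() / p.mean())

# Real-valued OFDM symbol via a Hermitian-symmetric IFFT (as used for intensity modulation)
n_sc = 64                                                        # assumed number of subcarriers
qam = (rng.choice([-1, 1], n_sc // 2 - 1) +
       1j * rng.choice([-1, 1], n_sc // 2 - 1)) / np.sqrt(2)     # 4-QAM data symbols
spectrum = np.concatenate(([0], qam, [0], np.conj(qam[::-1])))
ofdm = np.fft.ifft(spectrum).real

# OOK: rectangular on/off pulses
ook = rng.integers(0, 2, n_sc).astype(float)

print(f"OFDM PAPR: {papr_db(ofdm):5.1f} dB")   # typically around 8-12 dB
print(f"OOK  PAPR: {papr_db(ook):5.1f} dB")    # close to 3 dB for equiprobable bits
```

The several-dB gap in PAPR is what forces the OFDM drive signal to back off from the LED's limited linear range, which in turn shortens the usable link distance relative to OOK.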
LINK
Background: Lexical access problems with inflected verbs are common in aphasia. Previous research addressed these problems either in purely linguistic terms (e.g., verb movement) or in terms of lexical characteristics (e.g., frequency). We propose a new measure of verb complexity which combines linguistic and lexical characteristics and is formulated in terms of Shannon's information theory. Aims: We aim to explore the complexity of individual verbs and verb paradigms and its effect on lexical access, both in unimpaired people and in people with aphasia (PWA). We apply information theory to investigate the impact of verb complexity on reaction time (RT) in lexical decision. Methods & Procedures: 20 non-fluent aphasic subjects and 11 age- and education-matched peers performed an auditory lexical decision task containing 286 real and 286 phonotactically legal non-word past tense forms (regulars and irregulars). RTs and error rates were measured. Two information-theoretic measures were calculated: inflectional entropy (reflecting the probabilistic variability of forms within a given verbal family) and information load (I) (reflecting the complexity of an individual verb form). The effects of these and of other, more traditional measures on RT were assessed. Outcomes & Results: Linear mixed-model analyses were applied to the data for each group, with participant and verb as crossed random effects. Results show that for all groups inflectional entropy had a facilitatory effect on RT. There was a group effect for inflectional entropy, indicating that for the patients with aphasia the effect of inflectional entropy was less pronounced. At the same time, I correlated with latencies for healthy adults but not for individuals with aphasia. Conclusions: Our results demonstrate that the decrease in lexical processing capacity characteristic of PWA has a measurable effect that can be quantified using information-theoretic means. According to our model, these individuals have particular difficulties with processing lexical items of higher complexity, as measured by individual I, and benefit less from the support normally provided (in comprehension) by other members of the corresponding lexical network. Finally, the proposed information-theoretic complexity measures, which encompass both frequency effects and linguistic parameters, provide a superior measure of lexical access and have better explanatory power for the analysis of access problems found in non-fluent aphasia, compared with analyses based on frequency alone.
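The two measures can be illustrated with a short worked example (a minimal sketch with invented frequency counts; this is one common operationalization and the study's exact formulas may differ):

```python
import math

# Hypothetical corpus frequencies of the forms in one verb's inflectional paradigm
paradigm_counts = {"walk": 520, "walks": 180, "walked": 240, "walking": 160}

total = sum(paradigm_counts.values())
probs = {form: count / total for form, count in paradigm_counts.items()}

# Inflectional entropy: probabilistic variability of forms within the verbal family
entropy = -sum(p * math.log2(p) for p in probs.values())

# Information load I of an individual form: its surprisal within the paradigm
info_load = {form: -math.log2(p) for form, p in probs.items()}

print(f"inflectional entropy H = {entropy:.2f} bits")
print(f"I('walked') = {info_load['walked']:.2f} bits")
```

On this view, a verb family with many roughly equiprobable forms has high entropy (a rich supporting network), while a rare form within its paradigm carries a high individual information load.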
LINK
The nonlinearity induced by light-emitting diodes in visible light communication (VLC) systems presents a challenge to the parametrization of orthogonal frequency-division multiplexing (OFDM). The goal of the multi-objective optimization problem presented in this study is to maximize the transmitted power (superimposed LED bias current and signal amplification) for both conventional and constant-envelope (CE) OFDM while also maximizing spectral efficiency. The bit error rate (BER) metric is used to evaluate the optimization using the non-dominated sorting genetic algorithm II (NSGA-II). Simulation results show that, for a BER of 1×10⁻³, the required signal-to-noise ratio (SNR) decreases as the guard band increases, owing to intermodulation distortion. With a guard band of 6% of the signal bandwidth, the VLC system with CE signals requires SNRs of approximately 10.8 and 24 dB for 4-quadrature amplitude modulation (QAM) and 16-QAM, respectively, compared with approximately 13 and 25 dB for conventional OFDM-based systems.
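A minimal sketch of how a constant-envelope OFDM waveform can be formed by phase-modulating a real-valued OFDM baseband follows (illustrative only; the modulation index, subcarrier count, and scaling are assumptions, not the paper's parameters):

```python
import numpy as np

rng = np.random.default_rng(1)

n_sc = 64                      # assumed number of subcarriers
h = 0.3                        # assumed modulation index

# Real-valued OFDM baseband via Hermitian symmetry (suitable for intensity modulation)
data = (rng.choice([-1, 1], n_sc // 2 - 1) +
        1j * rng.choice([-1, 1], n_sc // 2 - 1)) / np.sqrt(2)
spectrum = np.concatenate(([0], data, [0], np.conj(data[::-1])))
m = np.fft.ifft(spectrum).real
m /= np.max(np.abs(m))         # normalize the message signal

# Conventional OFDM drives the LED with m directly (high PAPR);
# CE-OFDM instead phase-modulates a carrier, giving a constant envelope.
ce_ofdm = np.exp(1j * 2 * np.pi * h * m)

print("OFDM    PAPR:", 10 * np.log10(np.max(m**2) / np.mean(m**2)), "dB")
print("CE-OFDM PAPR:", 10 * np.log10(np.max(np.abs(ce_ofdm)**2) /
                                     np.mean(np.abs(ce_ofdm)**2)), "dB")   # ~0 dB
```

The constant envelope is what lets the CE signal be driven harder into the LED's nonlinear characteristic without the clipping and intermodulation penalty that conventional OFDM suffers.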
DOCUMENT
A low-cost sensor node is introduced to monitor 5G EMF exposure in the Netherlands for the four FR1 frequency bands. The sensor node is validated with in-lab measurements using both CW and QAM signals; in both cases, and for all frequency bands, it achieves an error of less than 1 dB over a dynamic range of 40 dB. This sensor is a follow-up to our previously developed sensor and offers substantial improvements in terms of linearity, error, and stability.
DOCUMENT
Many quality aspects of software systems are addressed in the existing literature on software architecture patterns, but the aspect of system administration seems to be somewhat overlooked, even though it is an important aspect too. In this work we present three software architecture patterns that, when applied by software architects, support the work of system administrators: PROVIDE AN ADMINISTRATION API, SINGLE FILE LOCATION, and CENTRALIZED SYSTEM LOGGING. PROVIDE AN ADMINISTRATION API should solve the problems encountered when trying to automate administration tasks. The SINGLE FILE LOCATION pattern should help system administrators find the files of an application in one (hierarchical) place. CENTRALIZED SYSTEM LOGGING is useful to avoid a proliferation of logging formats and locations. Abstract provided by the authors. Published in PLoP '13: Proceedings of the 20th Conference on Pattern Languages of Programs, ACM.
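As a purely illustrative sketch of the PROVIDE AN ADMINISTRATION API pattern (the endpoints and the use of Python's standard HTTP server are assumptions, not part of the pattern description), administration tasks are exposed over HTTP so operators can script them instead of editing files or restarting the system by hand:

```python
from http.server import BaseHTTPRequestHandler, HTTPServer
import json

# Administration state that operators would otherwise have to change manually
app_state = {"log_level": "INFO", "healthy": True}

class AdminAPI(BaseHTTPRequestHandler):
    """Exposes administration tasks over HTTP so they can be automated."""

    def do_GET(self):
        if self.path == "/admin/health":            # health probe for monitoring tools
            self._reply(200, {"healthy": app_state["healthy"]})
        elif self.path == "/admin/config":          # inspect runtime configuration
            self._reply(200, app_state)
        else:
            self._reply(404, {"error": "unknown endpoint"})

    def do_POST(self):
        if self.path == "/admin/log-level":         # change log level without a restart
            length = int(self.headers.get("Content-Length", 0))
            body = json.loads(self.rfile.read(length) or b"{}")
            app_state["log_level"] = body.get("level", app_state["log_level"])
            self._reply(200, app_state)
        else:
            self._reply(404, {"error": "unknown endpoint"})

    def _reply(self, status, payload):
        data = json.dumps(payload).encode()
        self.send_response(status)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(data)))
        self.end_headers()
        self.wfile.write(data)

if __name__ == "__main__":
    HTTPServer(("127.0.0.1", 8081), AdminAPI).serve_forever()
```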
DOCUMENT
1. We assessed the hypothesized negative correlation between the influence of multiple predators and the body condition and fecundity of European hares from 13 areas in the Netherlands. 2. Year-round abundance of predators was estimated by hunters. We quantified predator influence as the sum of their field metabolic rates, as this sum reflects the daily food requirements of multiple individuals. We determined the ratio between body mass and hindfoot length of hares as an index of body condition and the weight of their adrenal glands as a measure of chronic exposure to stress, and we counted the number of placental scars to estimate the fecundity of hares. 3. As hypothesized, we found that the sum of the field metabolic rates of predators was negatively correlated with body condition and the number of placental scars, whereas it was positively related to the weight of the adrenal glands. In contrast to the sum of the field metabolic rates, the total number of predators affected the investigated risk responses only weakly or not at all. 4. The sum of the field metabolic rates can be a useful proxy for the influence of multiple predators, as it takes predator abundance, type, body weight, and food requirements into account. 5. With these findings, our paper contributes to a better understanding of the risk effects of multiple predators on prey fitness. Additionally, we identify a potential contributor to the decline of European hare populations.
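As a rough illustration of how a summed field metabolic rate can be obtained from predator counts and body masses (the species, counts, and allometric coefficients below are illustrative placeholders, not the values or equations used in the study):

```python
# Illustrative predator survey for one study area: (count, mean body mass in grams)
predators = {
    "red fox":      (4,  6500),
    "domestic cat": (12, 4000),
    "buzzard":      (3,   900),
}

def fmr_kj_per_day(mass_g, a=4.8, b=0.73):
    """Allometric field metabolic rate estimate; a and b are placeholder coefficients."""
    return a * mass_g ** b

# Predator influence = sum of the daily food requirements of all individuals
total_fmr = sum(count * fmr_kj_per_day(mass) for count, mass in predators.values())
total_count = sum(count for count, _ in predators.values())

print(f"summed FMR: {total_fmr:.0f} kJ/day across {total_count} predators")
```

Unlike a simple head count, this sum weights each predator by its size-dependent energy (and hence prey) requirements, which is why it can track risk effects that the total number of predators misses.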
MULTIFILE
Gamma-band neuronal synchronization during sentence-level language comprehension has previously been linked with semantic unification. Here, we attempt to further narrow down the functional significance of gamma during language comprehension, by distinguishing between two aspects of semantic unification: successful integration of word meaning into the sentence context, and prediction of upcoming words. We computed eventrelated potentials (ERPs) and frequency band-specific electroencephalographic (EEG) power changes while participants read sentences that contained a critical word (CW) that was (1) both semantically congruent and predictable (high cloze, HC), (2) semantically congruent but unpredictable (low cloze, LC), or (3) semantically incongruent (and therefore also unpredictable; semantic violation, SV). The ERP analysis showed the expected parametric N400 modulation (HC < LC < SV). The time-frequency analysis showed qualitatively different results. In the gamma-frequency range, we observed a power increase in response to the CW in the HC condition, but not in the LC and the SV conditions. Additionally, in the theta frequency range we observed a power increase in the SV condition only. Our data provide evidence that gamma power increases are related to the predictability of an upcoming word based on the preceding sentence context, rather than to the integration of the incoming word's semantics into the preceding context. Further, our theta band data are compatible with the notion that theta band synchronization in sentence comprehension might be related to the detection of an error in the language input.
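One common way to obtain band-limited power changes of this kind is a band-pass filter followed by the Hilbert envelope, sketched below on synthetic data (the study's actual pipeline, e.g. wavelet- or multitaper-based, may differ; sampling rate and band edges are assumptions):

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

fs = 500                                   # sampling rate in Hz (assumed)
t = np.arange(0, 2, 1 / fs)                # 2 s epoch around the critical word
rng = np.random.default_rng(0)
eeg = rng.standard_normal(t.size)          # placeholder single-channel EEG epoch

def band_power(signal, low, high, fs):
    """Instantaneous power in a frequency band via band-pass filtering + Hilbert envelope."""
    b, a = butter(4, [low, high], btype="band", fs=fs)
    envelope = np.abs(hilbert(filtfilt(b, a, signal)))
    return envelope ** 2

gamma = band_power(eeg, 40, 70, fs)        # gamma band (assumed 40-70 Hz)
theta = band_power(eeg, 4, 7, fs)          # theta band (assumed 4-7 Hz)

# Express power as percent change relative to a pre-stimulus baseline (first 0.5 s)
baseline = slice(0, int(0.5 * fs))
gamma_change = 100 * (gamma - gamma[baseline].mean()) / gamma[baseline].mean()
```

Condition-wise averages of such baseline-corrected band power, time-locked to the CW, are what allow gamma and theta effects to be compared across the HC, LC and SV conditions.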
MULTIFILE
The built environment requires energy-flexible buildings to reduce energy peak loads and to maximize the use of (decentralized) renewable energy sources. The challenge is to arrive at smart control strategies that respond to the increasing variations in both energy demand and the variable energy supply. This enables grid integration in existing energy networks with limited capacity and maximises the use of decentralized sustainable generation. Buildings can play a key role in optimizing grid capacity by applying demand-side management control. To adjust the grid energy demand profile of a building without compromising user requirements, the building should acquire some energy flexibility capacity. The main ambition of the Brains for Buildings Work Package 2 is to develop smart control strategies that use the operational flexibility of non-residential buildings to minimize energy costs, reduce emissions and avoid spikes in power network load, without compromising comfort levels. To realise this ambition, the following key components will be developed within the B4B WP2: (A) open-source HVAC and electric services models, (B) energy demand prediction models and (C) flexibility management control models. This report describes the first two key components, (A) and (B), and presents different prediction models covering various building components. The models are of three types: white-box models, grey-box models, and black-box models. Each model developed is presented in a separate chapter, which starts with the goal of the prediction model, followed by a description of the model and the results obtained when it is applied to a case study. The models developed are: (1) white-box models based on Modelica libraries for energy prediction of a building and its components; (2) a hybrid predictive digital twin based on white-box building models to predict the dynamic energy response of the building and its components; (3) a model that uses CO₂ monitoring data to derive either ventilation flow rate or occupancy; (4) prediction of the heating demand of a building; (5) a feedforward neural network model to predict building energy use and its uncertainty; and (6) prediction of PV solar production. The first model aims to predict the energy use and energy production pattern of different building configurations with open-source software, OpenModelica, and open-source libraries, the IBPSA libraries. The white-box model simulation results are used to produce design and control advice for increasing building energy flexibility. The use of the libraries for building a model was first tested on a simple residential unit and is now being tested on a non-residential unit, the Haagse Hogeschool building. The lessons learned show that it is possible to model a building using a combination of libraries; however, developing the model is very time-consuming. The test also highlighted the need to define standard scenarios for testing energy flexibility, and the need for practical visualization if the simulation results are to be used to give advice about potentially increasing energy flexibility. The goal of the hybrid model, which is based on a white-box model for the building and systems and a data-driven model for user behaviour, is to predict the energy demand and energy supply of a building.
The model's application focuses on the use case of the TNO building at Stieltjesweg in Delft during a summer period, with a specific emphasis on cooling demand. Preliminary analysis shows that the monitored building behaviour is in line with the simulation results. Development is currently in progress to improve the model predictions by including solar shading from surrounding buildings, models of automatic shading devices, and model calibration including the energy use of the chiller. The goal of the third model is to derive the recent and current ventilation flow rate over time from monitoring data on CO₂ concentration and occupancy, as well as to derive recent and current occupancy over time from monitoring data on CO₂ concentration and ventilation flow rate. The grey-box model used is based on the GEKKO Python tool (a sketch of the underlying CO₂ mass balance is given after this summary). The model was tested with data from six office rooms at Windesheim University of Applied Sciences. The model had low precision in deriving the ventilation flow rate, especially at low CO₂ concentrations, but good precision in deriving occupancy from CO₂ concentration and ventilation flow rate. Further research is needed to determine whether these findings apply in different situations, such as meeting spaces and classrooms. The goal of the fourth chapter is to compare a simplified white-box model and a black-box model for predicting the heating energy use of a building, with the aim of integrating these prediction models into the energy management systems of SME buildings. The two models were tested with data from a residential unit, since data from an SME building were not available at the time of the analysis. The prediction models developed have low accuracy and in their current form cannot be integrated into an energy management system. In general, the black-box model achieved higher accuracy than the white-box model. The goal of the fifth model is to predict the energy use in a building using a black-box model and to quantify the uncertainty in the prediction. The black-box model is based on a feedforward neural network. The model was tested with data from two buildings: an educational and a commercial building. The strength of the model lies in the ensemble prediction and the recognition that uncertainty is intrinsically present in the data as an absolute deviation. Using a rolling-window technique, the model can predict energy use and uncertainty, incorporating possible changes in building use. The testing on two different cases demonstrates the applicability of the model to different types of buildings. The goal of the sixth and last model developed is to predict the energy production of PV panels in a building using a black-box model. The choice to model the PV panels is based on an analysis of the main contributors to peak energy demand and peak energy delivery in the case of the DWA office building. On a fault-free test set, the model meets the requirements for a calibrated model according to the FEMP and ASHRAE criteria for the error metrics; according to the IPMVP criteria, the model should be improved further. The results of the performance metrics agree in range with values found in the literature. For accurate peak prediction, a year of training data is recommended in the given approach without lagged variables.
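For reference, the single-zone CO₂ mass balance that underlies the third (grey-box) model described above can be sketched as follows; this is a simplified discrete-time version with assumed parameter values, whereas the report's implementation uses the GEKKO tool and may differ in detail:

```python
import numpy as np

V = 150.0      # room volume in m³ (assumed)
C_OUT = 420.0  # outdoor CO₂ concentration in ppm (assumed)
G = 5.0        # CO₂ source strength per person in ppm·m³/s (≈ 18 L/h, assumed)
DT = 300.0     # monitoring interval in s (assumed 5 minutes)

def ventilation_from_co2(co2_ppm, occupancy):
    """Solve V*dC/dt = n*G + Q*(C_out - C) for the ventilation flow rate Q (m³/s)."""
    c = np.asarray(co2_ppm, float)
    n = np.asarray(occupancy, float)
    dc_dt = np.gradient(c, DT)                    # ppm/s
    return (V * dc_dt - n * G) / (C_OUT - c)

def occupancy_from_co2(co2_ppm, q_vent):
    """Solve the same balance for occupancy n, given the ventilation flow Q (m³/s)."""
    c = np.asarray(co2_ppm, float)
    q = np.asarray(q_vent, float)
    dc_dt = np.gradient(c, DT)
    return (V * dc_dt - q * (C_OUT - c)) / G

# Example: CO₂ rising from 500 to 900 ppm over one hour with four occupants
co2 = np.linspace(500, 900, 13)                   # one reading every 5 minutes
print(ventilation_from_co2(co2, occupancy=np.full(13, 4)))
```

When the indoor concentration is close to the outdoor level, the denominator (C_out - C) approaches zero, which is consistent with the reported low precision of the ventilation estimate at low CO₂ concentrations.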
This report presents the results and lessons learned from implementing white-box, grey-box and black-box models to predict the energy use and energy production of buildings, or of variables directly related to them. Each model has its advantages and disadvantages. Further research along these lines is needed to develop the full potential of this approach.
DOCUMENT
This paper reports on the first stage of a research project that aims to incorporate objective measures of physical activity into health and lifestyle surveys. Physical activity is typically measured with questionnaires that are known to have measurement issues and, specifically, to overestimate the amount of physical activity in the population. In a lab setting, 40 participants wore four different sensors on five different body parts while performing various activities (sitting, standing, stepping at two intensities, bicycling at two intensities, walking stairs and jumping). During the first four activities, energy expenditure was measured by monitoring heart rate and the gas volume of inspired and expired O₂ and CO₂. Participants subsequently wore two sensor systems (the ActivPAL on the thigh and the UKK on the waist) for a week. They also kept a diary to track their physical activities, work and travel hours. Machine learning algorithms were trained with different methods to determine which sensor and which method best differentiated the various activities and the intensity with which they were performed. The ActivPAL had the highest overall accuracy, possibly because the data generated at the upper thigh best distinguish between different types of activities. Accuracy could be slightly increased by including measures of heart rate. For recognizing intensity, three different measures were compared: allocation of MET values to activities (used by the ActivPAL), median absolute deviation, and heart rate. Each method has merits and disadvantages, but median absolute deviation seems to be the most promising metric. The search for the best method of gauging intensity is still ongoing. Subsequently, the algorithms developed for the lab data were used to determine physical activity during the week in which people wore the devices in their everyday lives. It quickly turned out that the models are far from ready to be used on free-living data. Two approaches are suggested to remedy this: the first is additional research with meticulously labelled free-living data, e.g., by combining a Time Use Survey with accelerometer measurements; the second is to focus on better determining the intensity of movement, e.g., with the help of unsupervised pattern recognition techniques. Accuracy was but one of the requirements for choosing a sensor system for subsequent research and the ultimate implementation of sensor measurement in health surveys. Sensor position on the body, wearability, costs, usability, flexibility of analysis, response, and adherence to protocol equally determine the choice of a sensor. Also from these additional points of view, the ActivPAL is our sensor of choice.
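To illustrate the median absolute deviation (MAD) intensity metric on raw accelerometer data, a minimal sketch follows; the sampling rate, epoch length, and axis handling are assumptions, not necessarily those used in the project:

```python
import numpy as np

def mad_per_epoch(acc_xyz, fs=50, epoch_s=5):
    """Median absolute deviation of the acceleration vector magnitude per epoch.

    acc_xyz: array of shape (n_samples, 3) with accelerations in g; fs: sampling rate in Hz.
    Higher MAD values indicate more intense movement within the epoch."""
    magnitude = np.linalg.norm(acc_xyz, axis=1)
    n = fs * epoch_s
    n_epochs = magnitude.size // n
    epochs = magnitude[: n_epochs * n].reshape(n_epochs, n)
    return np.median(np.abs(epochs - np.median(epochs, axis=1, keepdims=True)), axis=1)

# Example: one minute of simulated data, quiet sitting followed by brisk movement
rng = np.random.default_rng(0)
still = 1.0 + 0.01 * rng.standard_normal((1500, 3))
active = 1.0 + 0.30 * rng.standard_normal((1500, 3))
print(mad_per_epoch(np.vstack([still, active])))
```

Unlike MET allocation, which requires the activity type to be recognized first, MAD is computed directly from the signal, which is part of why it is attractive as an intensity metric for free-living data.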
DOCUMENT
In foul decision-making by football referees, visual search is important for gathering task-specific information to determine whether a foul has occurred. Yet little is known about the visual search behaviours underpinning excellent on-field decisions. The aim of this study was to examine the on-field visual search behaviour of elite and sub-elite football referees when calling a foul during a match. In doing so, we also compared the accuracy and gaze behaviour of correct and incorrect calls. Elite and sub-elite referees (elite: N = 5, age 29.8 ± 4.7 yrs, experience 14.8 ± 3.7 yrs; sub-elite: N = 9, age 23.1 ± 1.6 yrs, experience 8.4 ± 1.8 yrs) officiated an actual football game while wearing a mobile eye-tracker, and on-field visual search behaviour was compared between skill levels when calling a foul (elite: N = 66 calls; sub-elite: N = 92 calls). Results revealed that elite referees relied on a higher search rate (more fixations of shorter duration) compared to sub-elites, but with no differences in where they allocated their gaze, indicating that elites searched faster but did not necessarily direct their gaze towards different locations. Correct decisions were associated with higher gaze entropy (i.e. less structure). By relying on more structured gaze patterns when making incorrect decisions, referees may fail to pick up information specific to the foul situation. Referee development programmes might benefit from challenging the speed of information pickup while avoiding pre-determined gaze patterns, in order to improve the interpretation of fouls and increase the decision-making performance of referees.
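Gaze entropy of the kind reported here can be computed from fixation locations roughly as follows (an illustrative sketch on synthetic data; the grid used to discretize gaze locations is an assumption):

```python
import numpy as np

def gaze_entropy(fix_x, fix_y, bins=8):
    """Shannon entropy (bits) of the spatial distribution of fixations.

    Fixation coordinates (normalized to [0, 1]) are binned into a bins x bins grid;
    higher entropy means gaze is spread more evenly (less structured) across locations."""
    hist, _, _ = np.histogram2d(fix_x, fix_y, bins=bins, range=[[0, 1], [0, 1]])
    p = hist.ravel() / hist.sum()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

# Example: structured gaze (clustered fixations) vs. unstructured gaze (spread out)
rng = np.random.default_rng(0)
clustered = rng.normal(0.5, 0.05, (2, 80)).clip(0, 1)
spread = rng.uniform(0, 1, (2, 80))
print(gaze_entropy(*clustered), gaze_entropy(*spread))
```

A low-entropy (highly structured) fixation distribution corresponds to the pre-determined gaze patterns that the findings suggest are associated with incorrect calls.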
DOCUMENT