Several models in data analysis are estimated by minimizing an objective function defined as the residual sum of squares between the model and the data. A necessary and sufficient condition for the existence of a least squares estimator is that the objective function attains its infimum at a unique point. It is shown that the objective function for Parafac-2 need not attain its infimum, whereas those of DEDICOM, constrained Parafac-2, and, under a weak assumption, SCA and Dynamals do attain their infimum. Furthermore, the sequence of parameter vectors generated by an alternating least squares algorithm converges if it decreases the objective function to its infimum, provided that infimum is attained at one or finitely many points.
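The alternating least squares idea referred to above can be illustrated on the simplest case, an unconstrained low-rank factorization. This is a generic sketch on hypothetical data, not the Parafac-2 or DEDICOM objective; it shows why the sequence of objective values such an algorithm generates is monotonically non-increasing:

```python
import numpy as np

def als_lowrank(X, r, iters=100, seed=0):
    """Minimize ||X - A @ B.T||_F^2 by alternating least squares.

    Each half-step solves its linear least squares subproblem exactly,
    so the objective value can never increase between iterations."""
    rng = np.random.default_rng(seed)
    n, m = X.shape
    B = rng.standard_normal((m, r))
    losses = []
    for _ in range(iters):
        # Fix B, solve for A: minimizes ||X - A @ B.T||^2 over A.
        A = np.linalg.lstsq(B, X.T, rcond=None)[0].T
        # Fix A, solve for B: minimizes the same objective over B.
        B = np.linalg.lstsq(A, X, rcond=None)[0].T
        losses.append(np.linalg.norm(X - A @ B.T) ** 2)
    return A, B, losses

# Hypothetical rank-2 data matrix: ALS with r=2 fits it exactly.
X = np.arange(12.0).reshape(4, 3)
A, B, losses = als_lowrank(X, r=2)
```

Monotone decrease of `losses` is what drives the convergence argument in the abstract; whether the infimum is attained is a separate question about the parameter set.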
LINK
Estimation of the factor model by unweighted least squares (ULS) is distribution free, yields consistent estimates, and is computationally fast if the Minimum Residuals (MinRes) algorithm is employed. MinRes algorithms produce a converging sequence of monotonically decreasing ULS function values. Various suggestions for algorithms of the MinRes type are made, for confirmatory as well as exploratory factor analysis, including the implementation of inequality constraints and the prevention of Heywood cases. A simulation study, comparing the bootstrap standard deviations for the parameters with the standard errors from maximum likelihood, indicates that the two are virtually equal when the score vectors are sampled from the normal distribution. Two empirical examples demonstrate the usefulness of constrained exploratory and confirmatory factor analysis by ULS, used in conjunction with the bootstrap method.
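The ULS objective that MinRes-type algorithms minimize is the sum of squared off-diagonal residuals of the correlation matrix. One simple way to approximate it is iterated principal-axis factoring with communality updates; the sketch below is a simplified stand-in, not the paper's procedure, and the one-factor correlation matrix and clipping used to guard against Heywood cases are illustrative:

```python
import numpy as np

def minres_pa(R, k, iters=200):
    """Iterated principal-axis approximation to the ULS (MinRes) problem:
    fit R - L @ L.T on the off-diagonal by updating communalities."""
    R = np.asarray(R, float)
    # Squared multiple correlations as starting communalities.
    h2 = 1.0 - 1.0 / np.diag(np.linalg.inv(R))
    for _ in range(iters):
        Rh = R.copy()
        np.fill_diagonal(Rh, h2)            # replace diagonal by communalities
        vals, vecs = np.linalg.eigh(Rh)
        idx = np.argsort(vals)[::-1][:k]    # k largest eigenpairs
        L = vecs[:, idx] * np.sqrt(np.clip(vals[idx], 0.0, None))
        # Clipping at 1 prevents Heywood cases (communalities > 1).
        h2_new = np.clip((L ** 2).sum(axis=1), 0.0, 1.0)
        if np.max(np.abs(h2_new - h2)) < 1e-10:
            h2 = h2_new
            break
        h2 = h2_new
    return L, h2

# Hypothetical perfect one-factor correlation matrix.
lam = np.array([0.8, 0.7, 0.6, 0.5])
R = np.outer(lam, lam)
np.fill_diagonal(R, 1.0)
L, h2 = minres_pa(R, k=1)
resid = R - L @ L.T
np.fill_diagonal(resid, 0.0)                # ULS ignores the diagonal
```

For this clean one-factor matrix the off-diagonal residuals vanish and the loadings are recovered up to sign.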
DOCUMENT
Author supplied: "This paper gives a linearised adjustment model for the affine, similarity and congruence transformations in 3D that is easily extendable with other parameters to describe deformations. The model considers all coordinates stochastic; full positive semi-definite covariance matrices and correlation between epochs can be handled. The determination of transformation parameters between two or more coordinate sets, determined by geodetic monitoring measurements, can be handled as a least squares adjustment problem. It can be solved without linearisation of the functional model if it concerns an affine, similarity or congruence transformation in one-, two- or three-dimensional space. If the functional model describes more than such a transformation, it is hardly ever possible to find a direct solution for the transformation parameters; linearising the functional model and applying the least squares formulas is then the appropriate mode of working. The adjustment model is given as a model of observation equations with constraints on the parameters. The starting point is the affine transformation, whose parameters are constrained to obtain the parameters of the similarity or congruence transformation; in this way the use of Euler angles is avoided. Because the model is linearised, iteration is necessary to reach the final solution, and each iteration step requires approximate coordinates that fulfil the constraints. For the affine transformation it is easy to obtain approximate coordinates. For the similarity and congruence transformations the approximate coordinates have to comply with the constraints; to achieve this, use is made of the singular value decomposition of the rotation matrix. To show the effectiveness of the proposed adjustment model, total station measurements in two epochs of monitored buildings are analysed. Coordinate sets with full, rank-deficient covariance matrices are determined from the measurements and adjusted with the proposed model. Testing the adjustment for deformations results in detection of the simulated deformations."
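The SVD device mentioned for making approximate coordinates comply with the rotation constraints amounts, in essence, to an orthogonal Procrustes projection: replacing an approximate matrix by the nearest proper rotation. A minimal sketch (the perturbed 3x3 matrix is a hypothetical example, not data from the paper):

```python
import numpy as np

def nearest_rotation(M):
    """Project a (possibly non-orthogonal) 3x3 matrix onto the nearest
    proper rotation in the Frobenius norm via the SVD M = U S V^T:
    R = U V^T, with a sign fix so that det(R) = +1 (no reflection)."""
    U, _, Vt = np.linalg.svd(M)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(U @ Vt))])
    return U @ D @ Vt

# A rotation about the z-axis, perturbed so it no longer satisfies
# the orthogonality constraints exactly.
theta = 0.3
Rz = np.array([[np.cos(theta), -np.sin(theta), 0.0],
               [np.sin(theta),  np.cos(theta), 0.0],
               [0.0,            0.0,           1.0]])
M = Rz + 0.05 * np.arange(9.0).reshape(3, 3) / 9.0
R = nearest_rotation(M)   # exactly orthogonal with det +1
```

The projected matrix satisfies the rotation constraints to machine precision, so it can serve as an admissible approximate value in the next iteration step.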
MULTIFILE
In our highly digitalized society, cybercrime has become a common crime. However, because research into cybercriminals is in its infancy, our knowledge about cybercriminals is still limited. One of the main considerations is whether cybercriminals have higher intellectual capabilities than traditional criminals or even the general population. Although criminological studies clearly show that traditional criminals have lower intellectual capabilities, little is known about the relationship between cybercrime and intelligence. The current study adds to the literature by exploring the relationship between CITO-test scores and cybercrime in the Netherlands. The CITO final test is a standardized test for primary school students - usually taken at the age of 11 or 12 - and highly correlated with IQ-scores. Data from Statistics Netherlands were used to compare CITO-test scores of 143 apprehended cybercriminals with those of 143 apprehended traditional criminals and 143 non-criminals, matched on age, sex, and country of birth. Ordinary Least Squares regression analyses were used to compare CITO test scores between cybercriminals, traditional criminals, and non-criminals. Additionally, a discordant sibling design was used to control for unmeasured confounding by family factors. Findings reveal that cybercriminals have significantly higher CITO test scores compared to traditional criminals and significantly lower CITO test scores compared to non-criminals.
DOCUMENT
This article provides a description of the emergence of the Spanish ‘Occupy’ movement, Democracia real ya. The aim is to analyse the innovative discursive features of this movement and to connect this analysis to what we consider the innovative potential of the critical sciences. The movement is the result of a spontaneous uprising that appeared on the main squares of Madrid and Barcelona on 15 May 2011 and then spread to other Spanish cities. This date gave it its name: 15M. While the struggle for democracy in Spain is certainly not new, the 15M group shows a series of innovative features. These include the emphasis on peaceful struggle and the imaginary of a new democracy or worldview, transmitted through innovative placards and slogans designed by Spanish citizens. We consider these innovative not only due to their creativity, but also because of their use as a form of civil action. Our argument is that these placards both functioned as a sign of protest and, in combination with the demonstrations and the general dynamics of 15M, helped to reframe the population’s understanding of the crisis and rearticulate the identity of the citizens from victims to agents. In order to analyse the multimodal character of this struggle, we developed an interdisciplinary methodology, which combines socio-cognitive approaches that consider ideological proposals as socio-cognitive constructs (i.e. the notion of narrative or cognitive frame), and Critical Discourse Analysis (CDA) in the analysis of discourses related to processes of social imagination and transformation. The socio-constructivist perspective is used to consider these discourses in relation to their actors, particular contexts and actions. The use of CDA, which included a careful rhetorical analysis, helped to analyse the process of deconstruction, transformation and reconstruction that 15M uses to maintain its struggle.
The narrative analysis and the discursive theoretical concept of articulation helped to methodologically show aspects of the process of change alluded to above. This change was both in terms of cognition and in the modification of identity that turned a large part of the Spanish population from victims to indignados and to the neologism indignadanos, which is a composition of indignado and ciudadano (citizen).
DOCUMENT
INTRODUCTION: Innovations in head and neck cancer (HNC) treatment are often subject to economic evaluation prior to their reimbursement and subsequent access for patients. Mapping functions facilitate economic evaluation of new treatments when the required utility data are absent but quality of life data are available. The objective of this study is to develop a mapping function translating the EORTC QLQ-C30 to EQ-5D-derived utilities for HNC through regression modeling, and to explore the added value of disease-specific EORTC QLQ-H&N35 scales to the model. METHODS: Data were obtained on patients with primary HNC treated with curative intent in two hospitals. Model development was conducted in two phases: 1. predictor selection based on theory- and data-driven methods, resulting in three sets of potential predictors from the quality of life questionnaires; 2. selection of the best of four methods (ordinary least squares, mixed-effects linear, Cox and beta regression), using the first set of predictors: the EORTC QLQ-C30 scales corresponding most closely to the EQ-5D dimensions. Using a stepwise approach, we assessed the added value of the predictors in the other two sets. Model fit was assessed using the Akaike and Bayesian Information Criteria (AIC and BIC), and model performance was evaluated by MAE, RMSE and limits of agreement (LOA). RESULTS: The beta regression model showed the best fit, with the global health status, physical, role and emotional functioning, and pain scales as predictors. Adding HNC-specific scales did not improve the model. Model performance was reasonable: R2 = 0.39, MAE = 0.0949, RMSE = 0.1209, 95% LOA of -0.243 to 0.231 (bias -0.01), with an error correlation of 0.32. The estimated shrinkage factor was 0.90. CONCLUSIONS: Selected scales from the EORTC QLQ-C30 can be used to estimate utilities for HNC using beta regression. Including EORTC QLQ-H&N35 scales does not improve the mapping function. The mapping model may serve as a tool to enable cost-effectiveness analyses of innovative HNC treatments, for example for reimbursement decisions. Further research should assess the robustness and generalizability of the function by validating the model in an external cohort of HNC patients.
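As a generic illustration of how such a mapping function is estimated, the sketch below fits the ordinary least squares variant, one of the four candidate methods compared in the abstract, on synthetic stand-in data (the scale scores, coefficients, and error levels are invented, not the study's patient data):

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-ins: five 0-100 quality-of-life scale scores per
# patient and an EQ-5D-like utility bounded in (0, 1].
n = 200
scales = rng.uniform(0, 100, size=(n, 5))
true_beta = np.array([0.004, 0.001, 0.001, 0.001, -0.002])
utility = np.clip(0.3 + scales @ true_beta + rng.normal(0, 0.03, n),
                  0.01, 1.0)

# OLS mapping: utility ~ intercept + scale scores.
X = np.column_stack([np.ones(n), scales])
beta, *_ = np.linalg.lstsq(X, utility, rcond=None)

# Performance metrics of the kind reported in the abstract.
pred = X @ beta
mae = np.mean(np.abs(utility - pred))
rmse = np.sqrt(np.mean((utility - pred) ** 2))
```

A beta regression, which the study found to fit best, would replace the linear model with one whose predictions respect the bounded (0, 1) range of utilities; OLS can predict outside that range, which is one motivation for comparing the four methods.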
DOCUMENT
Due to the increase in scale brought about by (digital) technology, we are increasingly losing sight of our "common grounds". Our behaviour becomes bizarre, almost psychotic, and is amplified by social media. In these Big Tech 'village squares', we are both consumers and producers, like the sheep of yesteryear, overstimulated and overfed with hyper food, craving human contact.
MULTIFILE
Anomaly detection is a key factor in the processing of large amounts of sensor data from Wireless Sensor Networks (WSN). Efficient anomaly detection algorithms can be devised that perform online node-local computations and reduce communication overhead, thus improving the use of the limited hardware resources. This work introduces a fixed-point embedded implementation of the Online Sequential Extreme Learning Machine (OS-ELM), an online learning algorithm for Single Layer Feedforward Neural Networks (SLFN). To overcome the stability issues introduced by the fixed precision, we apply correction mechanisms previously proposed for Recursive Least Squares (RLS). The proposed implementation is tested extensively on generated and real-world datasets, and compared with RLS, Linear Least Squares Estimation, and a rule-based method as benchmarks. The methods are evaluated on prediction accuracy and on the detection of anomalies. The experimental results demonstrate that fixed-point OS-ELM can be successfully implemented on resource-limited embedded systems, with guarantees of numerical stability. Furthermore, the detection accuracy of fixed-point OS-ELM shows better generalization properties than, for instance, fixed-point RLS. © 2013 IEEE.
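A floating-point sketch of the OS-ELM update, which is recursive least squares applied to a fixed random hidden layer, may clarify the algorithm the paper ports to fixed point. The network size, activation, and training signal below are illustrative assumptions, not the paper's configuration:

```python
import numpy as np

class OSELM:
    """Minimal floating-point OS-ELM for a single-output SLFN.
    (The paper's contribution is a fixed-point embedded version with
    stability corrections; this sketch only shows the update rule.)"""

    def __init__(self, n_in, n_hidden, seed=0):
        rng = np.random.default_rng(seed)
        # Input weights and biases are random and never trained.
        self.W = rng.uniform(-1, 1, (n_in, n_hidden))
        self.b = rng.uniform(-1, 1, n_hidden)
        self.beta = np.zeros(n_hidden)        # trained output weights
        self.P = np.eye(n_hidden) * 1e3       # large initial covariance

    def _h(self, x):
        return np.tanh(x @ self.W + self.b)   # random hidden features

    def update(self, x, y):
        """One per-sample recursive least squares step on beta."""
        h = self._h(x)
        Ph = self.P @ h
        k = Ph / (1.0 + h @ Ph)               # gain vector
        err = y - h @ self.beta               # prediction residual
        self.beta += k * err
        self.P -= np.outer(k, Ph)             # covariance downdate
        return err

    def predict(self, x):
        return self._h(x) @ self.beta

# Learn y = sin(x) online from streaming samples (hypothetical data).
rng = np.random.default_rng(1)
model = OSELM(n_in=1, n_hidden=20)
for _ in range(2000):
    x = rng.uniform(-3, 3, 1)
    model.update(x, np.sin(x[0]))
err = abs(model.predict(np.array([1.0])) - np.sin(1.0))
```

In fixed-point arithmetic the covariance downdate of `P` is exactly where numerical trouble arises, which is why the paper borrows RLS correction mechanisms.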
DOCUMENT
The maturing field of Wireless Sensor Networks (WSN) results in long-lived deployments that produce large amounts of sensor data. Lightweight online on-mote processing may improve the usage of their limited resources, such as energy, by transmitting only unexpected sensor data (anomalies). We detect anomalies by analyzing sensor reading predictions from a linear model. We use Recursive Least Squares (RLS) to estimate the model parameters, because for large datasets the standard Linear Least Squares Estimation (LLSE) is not resource friendly. We evaluate the use of fixed-point RLS with adaptive thresholding, and its application to anomaly detection in embedded systems. We present an extensive experimental campaign on generated and real-world datasets, with floating-point RLS, LLSE, and a rule-based method as benchmarks. The methods are evaluated on prediction accuracy of the models, and on detection of anomalies, which are injected in the generated dataset. The experimental results show that the proposed algorithm is comparable, in terms of prediction accuracy and detection performance, to the other LS methods. However, fixed-point RLS is efficiently implementable in embedded devices. The presented method enables online on-mote anomaly detection with results comparable to offline LS methods. © 2013 IEEE.
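The scheme described, predicting each reading with an RLS-fitted linear model and flagging large residuals, can be sketched as follows. The autoregressive model order, forgetting factor, and threshold are illustrative assumptions, and the adaptive thresholding is simplified to a running residual variance:

```python
import numpy as np

def rls_anomaly(stream, order=3, lam=0.98, thresh=3.0):
    """Predict each sample from the previous `order` samples with
    recursive least squares (forgetting factor `lam`); flag samples
    whose residual exceeds `thresh` running residual std-devs."""
    theta = np.zeros(order)                    # AR model parameters
    P = np.eye(order) * 1e3                    # inverse information matrix
    anomalies, resid_var = [], 1.0
    for t in range(order, len(stream)):
        x = stream[t - order:t][::-1]          # regressor: recent history
        e = stream[t] - x @ theta              # prediction residual
        if e * e > thresh ** 2 * resid_var:
            anomalies.append(t)                # anomaly: skip model update
            continue
        resid_var = lam * resid_var + (1 - lam) * e * e
        Px = P @ x                             # standard RLS update
        k = Px / (lam + x @ Px)
        theta += k * e
        P = (P - np.outer(k, Px)) / lam
    return anomalies

# Smooth sensor-like signal with one injected spike (synthetic data).
t = np.arange(400)
signal = np.sin(2 * np.pi * t / 50.0)
signal[250] += 5.0
found = rls_anomaly(signal)
```

Skipping the model update on flagged samples keeps the anomaly from corrupting the parameter estimates; on this signal the spike (and the few predictions whose regressors still contain it) are the only detections.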
DOCUMENT
The aim of this presentation was to stress the utmost importance of gaining insight into the physical-spatial quality and context of the urban fabric as a whole before venturing into the realm of transformation design proposals. For large-scale areas and confined objects alike, the quality of the urban frame is a precondition for the socioeconomic efficacy of the programme in question and for the role and position of public spaces such as squares, parks, (main) streets and urban axes.
DOCUMENT