893 results for data analysis: algorithms and implementation
Abstract:
Flicker is a power quality phenomenon that refers to cyclic instability of light intensity resulting from supply voltage fluctuation, which in turn can be caused by disturbances introduced during power generation, transmission or distribution. The standard EN 61000-4-15, recently adopted also by the IEEE as IEEE Standard 1453, relies on the analysis of the supply voltage, which is processed according to a suitable model of the lamp – human eye – brain chain. For the lamp, an incandescent 60 W, 230 V, 50 Hz source is assumed. The human eye – brain model is represented by the so-called flicker curve. This curve was determined several years ago by statistically analyzing the results of tests in which people were subjected to flicker with different combinations of magnitude and frequency. The limitations of this standard approach to flicker evaluation are essentially two. First, the provided annoyance index Pst can be related to actual tiredness of the human visual system only if such an incandescent lamp is used. Moreover, the implemented response to flicker is “subjective”, given that it relies on people’s answers about their feelings. In the last 15 years, many scientific contributions have tackled these issues by investigating the possibility of developing a novel model of the eye–brain response to flicker and of overcoming the strict dependence of the standard on the kind of light source. In this context, this thesis aims to present an important contribution towards a new Flickermeter. An improved visual system model based on a physiological parameter, the mean value of the pupil diameter, has been presented, thus allowing a more “objective” representation of the response to flicker. The system used both to generate flicker and to measure the pupil diameter has been illustrated, along with the results of several experiments performed on volunteers. The intent has been to demonstrate that the measurement of this geometrical parameter can give reliable information about the response of the human visual system to light flicker.
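As a rough illustration of the disturbance a flickermeter is meant to quantify, the sketch below generates a 230 V, 50 Hz supply whose amplitude is modulated at a low frequency. All numbers here (sampling rate, the 8.8 Hz modulation frequency, the 0.5% relative fluctuation) are hypothetical choices for the example and are not taken from the thesis.

```python
import numpy as np

# Illustrative (hypothetical) parameters: a 230 V RMS, 50 Hz supply whose
# amplitude is modulated at a flicker frequency f_m with relative depth dV/V.
fs = 10_000                      # sampling rate [Hz]
t = np.arange(0.0, 2.0, 1.0 / fs)
V_rms, f_c = 230.0, 50.0
f_m, depth = 8.8, 0.005          # 8.8 Hz is near the eye's peak sensitivity

carrier = np.sqrt(2) * V_rms * np.sin(2 * np.pi * f_c * t)
modulation = 1.0 + 0.5 * depth * np.sin(2 * np.pi * f_m * t)
v = modulation * carrier         # fluctuating supply voltage seen by the lamp

# A standard flickermeter would pass v through the lamp / eye / brain model;
# here we only report the relative fluctuation that drives the annoyance index.
print(f"relative voltage fluctuation dV/V = {depth:.3%} at {f_m} Hz")
```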
Abstract:
Ion channels are pore-forming proteins that regulate the flow of ions across biological cell membranes. They are fundamental in generating and regulating the electrical activity of cells in the nervous system and the contraction of muscular cells. Solid-state nanopores are nanometer-scale pores located in electrically insulating membranes; they can be adopted as detectors of specific molecules in electrolytic solutions. Permeation of ions from one electrolytic solution to another, through a protein channel or a synthetic pore, is a process of considerable importance, and realistic analyses of the main dependencies of the ion current on the geometrical and compositional characteristics of these structures are highly needed. The project described in this thesis is an effort to improve the understanding of ion channels by devising methods for computer simulation that can predict channel conductance from channel structure. The thesis describes the theory, algorithms and implementation techniques used to develop a novel 3-D numerical simulator of ion channels and synthetic nanopores based on the Brownian Dynamics technique. This numerical simulator could represent a valid tool for the study of protein ion channels and synthetic nanopores, allowing investigation, at the atomic level, of the complex electrostatic interactions that determine channel conductance and ion selectivity. Moreover, it will provide insights into how parameters such as temperature, applied voltage, and pore shape could influence ion translocation dynamics. Furthermore, it will help make predictions of the conductance of given channel structures, and it will add information such as the electrostatic potential or ionic concentrations throughout the simulation domain, helping the understanding of ion flow through membrane pores.
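The abstract does not give the simulator's equations; as a minimal sketch of the Brownian Dynamics technique it builds on, the code below advances a single ion with the standard overdamped update, position plus (F/γ)Δt plus a thermal kick of size sqrt(2DΔt), with D obtained from the Einstein relation. The force, friction coefficient and time step are placeholder values, not parameters from the thesis.

```python
import numpy as np

kB = 1.380649e-23            # Boltzmann constant [J/K]

def brownian_step(pos, force, gamma, T, dt, rng):
    """One overdamped Brownian Dynamics update for a single ion (3-D).

    pos   : current position, shape (3,) [m]
    force : systematic (e.g. electrostatic) force at pos, shape (3,) [N]
    gamma : friction coefficient [kg/s]
    T     : temperature [K]
    dt    : time step [s]
    """
    D = kB * T / gamma                                       # Einstein relation
    drift = force / gamma * dt                               # deterministic drift
    noise = np.sqrt(2.0 * D * dt) * rng.standard_normal(3)   # thermal kick
    return pos + drift + noise

# Toy usage with hypothetical numbers (not the thesis's parameters).
rng = np.random.default_rng(0)
pos = np.zeros(3)
for _ in range(1000):
    pos = brownian_step(pos, force=np.array([0.0, 0.0, 1e-12]),
                        gamma=2e-12, T=300.0, dt=1e-12, rng=rng)
print(pos)
```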
Abstract:
A permutation is said to avoid a pattern if it does not contain any subsequence that is order-isomorphic to it. Donald Knuth, in the first volume of his celebrated book "The Art of Computer Programming", observed that the permutations that can be computed (or, equivalently, sorted) by certain particular data structures can be characterized in terms of pattern avoidance. In more recent years the topic has been reopened several times, often in terms of sortable permutations rather than computable ones. The idea of sorting permutations by using one of Knuth’s devices suggests looking for a deterministic procedure that decides, in linear time, whether there exists a sequence of operations able to convert a given permutation into the identity. In this thesis we show that, for the stack and the restricted deques, there exists a unique way to implement such a procedure. Moreover, we use these sorting procedures to create new sorting algorithms, and we prove some unexpected commutation properties between these procedures and the base step of bubblesort. We also show that the permutations that can be sorted by a combination of the base steps of bubblesort and its dual can be expressed, once again, in terms of pattern avoidance. In the final chapter we give an alternative proof of some enumerative results, in particular for the classes of permutations that can be sorted by the two restricted deques. It is well known that the permutations that can be sorted through a restricted deque are counted by the Schröder numbers. In the thesis, we show how the deterministic sorting procedures yield a bijection between sortable permutations and Schröder paths.
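For the single-stack device, the deterministic linear-time procedure mentioned above is the classical one-pass stack sort studied by Knuth: a permutation is stack-sortable exactly when the procedure outputs the identity, equivalently when it avoids the pattern 231. The sketch below shows that classical case only; the thesis's procedures for the restricted deques are not reproduced here.

```python
def stack_sortable(perm):
    """Deterministic one-pass stack sort (the classical single-stack case).

    Before pushing each element, pop every stacked element smaller than the
    incoming one to the output.  The permutation is sortable by a single
    stack iff the output is the identity (equivalently, iff it avoids 231).
    """
    stack, output = [], []
    for x in perm:
        while stack and stack[-1] < x:
            output.append(stack.pop())
        stack.append(x)
    while stack:
        output.append(stack.pop())
    return output == sorted(perm)

print(stack_sortable([3, 1, 2, 4]))   # True  (avoids 231)
print(stack_sortable([2, 3, 1]))      # False (contains 231)
```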
Abstract:
Coral reefs are the most biodiverse ecosystems of the ocean and they provide notable ecosystem services. Nowadays they are facing a number of local anthropogenic threats, and environmental change is threatening their survival on a global scale. Large-scale monitoring is necessary to understand environmental changes and to take useful conservation measures. Governmental agencies are often underfunded and unable to sustain the necessary spatial and temporal large-scale monitoring. To overcome these economic constraints, scientists can in some cases engage volunteers in environmental monitoring. Citizen Science enables the collection and analysis of scientific data at larger spatial and temporal scales than otherwise possible, addressing issues that are otherwise logistically or financially unfeasible. “STE: Scuba Tourism for the Environment” was a volunteer-based Red Sea coral reef biodiversity monitoring program in which SCUBA divers and snorkelers were involved in the collection of data on 72 taxa by completing survey questionnaires after their dives. In my thesis, I evaluated the reliability of the data collected by volunteers, comparing their questionnaires with those completed by professional scientists. Validation trials showed a sufficient level of reliability, indicating that non-specialists performed similarly to conservation volunteer divers carrying out accurate transects. Using the data collected by volunteers, I developed a biodiversity index that revealed spatial trends across the surveyed areas. The project results provided important feedback to the local authorities on the current health status of Red Sea coral reefs and on the effectiveness of the environmental management. I also analysed the spatial and temporal distribution of each surveyed taxon, identifying abundance trends related to anthropogenic impacts. Finally, I evaluated the effectiveness of the project in increasing the environmental education of volunteers and showed that participation in the STE project significantly increased both knowledge of coral reef biology and ecology and awareness of the impacts of human behaviour on the environment.
Abstract:
Over time, Twitter has become a fundamental source of news. As a step forward, researchers have tried to analyse whether tweets contain predictive power. In the financial field, much past research has aimed at proposing a function that takes as input all the tweets for a particular stock or index s, analyses them, and predicts the price of s. In this work, we take an alternative approach: using stock prices and tweet information, we investigate the following questions. 1. Is there any relation between the amount of tweets being generated and the volume of shares being exchanged? 2. Is there any relation between the sentiment of the tweets and stock prices? 3. What is the structure of the graph that describes the relationships between users?
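Question 1 amounts to measuring the association between two daily time series, tweet counts and traded volume. A minimal sketch with hypothetical numbers follows (the data sets, preprocessing and statistical tests used in the work are not given in the abstract); the same pattern would apply to a daily sentiment score versus closing price for question 2.

```python
import numpy as np

# Hypothetical daily series for one stock: tweet counts and traded volume.
tweet_counts = np.array([120, 340, 290, 510, 430, 380, 600], dtype=float)
traded_volume = np.array([1.1e6, 2.0e6, 1.7e6, 3.2e6, 2.5e6, 2.2e6, 3.9e6])

# Pearson correlation between the two daily series.
r = np.corrcoef(tweet_counts, traded_volume)[0, 1]
print(f"Pearson r = {r:.2f}")
```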
Abstract:
The present work studies a km-scale data assimilation scheme based on a LETKF developed for the COSMO model. The aim is to evaluate the impact of the assimilation of two different types of data: temperature, humidity, pressure and wind data from conventional networks (SYNOP, TEMP, AIREP reports), and 3D reflectivity from radar volumes. A 3-hourly continuous assimilation cycle has been implemented over an Italian domain, based on a 20-member ensemble with boundary conditions provided by ECMWF ENS. Three different experiments have been run to evaluate the performance of the assimilation over one week in October 2014, during which the Genova and Parma floods took place: a control run of the data assimilation cycle with assimilation of data from conventional networks only, a second run in which the SPPT scheme is activated in the COSMO model, and a third run in which reflectivity volumes from meteorological radar are also assimilated. Objective evaluation of the experiments has been carried out both on case studies and on the entire week: checking the analysis increments, computing the Desroziers statistics for SYNOP, TEMP, AIREP and RADAR data over the Italian domain, verifying the analyses against data not assimilated (temperature at the lowest model level objectively verified against SYNOP data), and objectively verifying the deterministic forecasts initialised with the KENDA analyses for each of the three experiments.
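The LETKF used in KENDA is not spelled out in the abstract; as a generic illustration of an ensemble Kalman analysis step, the sketch below updates a 20-member ensemble with a single scalar observation using the stochastic (perturbed-observation) EnKF formulas. The state size, observation operator and error variance are placeholders, and no localization or transform in ensemble space is shown.

```python
import numpy as np

def enkf_analysis(X_b, y_obs, H, obs_var, rng):
    """Minimal stochastic EnKF analysis for a single scalar observation.

    X_b     : background ensemble, shape (n_state, n_members)
    y_obs   : observed value (scalar)
    H       : observation operator as a vector, shape (n_state,)
    obs_var : observation-error variance R
    """
    n_state, n_mem = X_b.shape
    x_mean = X_b.mean(axis=1, keepdims=True)
    Xp = X_b - x_mean                          # ensemble perturbations
    Hx = H @ X_b                               # simulated observations, (n_members,)
    Hx_p = Hx - Hx.mean()
    PHt = Xp @ Hx_p / (n_mem - 1)              # P_b H^T, shape (n_state,)
    HPHt = Hx_p @ Hx_p / (n_mem - 1)           # H P_b H^T, scalar
    K = PHt / (HPHt + obs_var)                 # Kalman gain
    # Perturbed observations keep the analysis spread statistically consistent.
    y_pert = y_obs + rng.normal(0.0, np.sqrt(obs_var), n_mem)
    return X_b + np.outer(K, y_pert - Hx)

# Toy usage: 3 state variables, 20 members (only the member count echoes the
# thesis setup; everything else is hypothetical).
rng = np.random.default_rng(1)
X_b = rng.normal(15.0, 1.0, size=(3, 20))
H = np.array([1.0, 0.0, 0.0])                  # observe the first state variable
X_a = enkf_analysis(X_b, y_obs=16.2, H=H, obs_var=0.5, rng=rng)
print(X_a.mean(axis=1))
```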
Abstract:
We use long instrumental temperature series together with available field reconstructions of sea-level pressure (SLP) and three-dimensional climate model simulations to analyze relations between temperature anomalies and atmospheric circulation patterns over much of Europe and the Mediterranean for the late winter/early spring (January–April, JFMA) season. A Canonical Correlation Analysis (CCA) investigates interannual to interdecadal covariability between a new gridded SLP field reconstruction and seven long instrumental temperature series covering the past 250 years. We then present and discuss prominent atmospheric circulation patterns related to anomalously warm and cold JFMA conditions within different European areas spanning the period 1760–2007. Next, using a data assimilation technique, we link gridded SLP data with a climate model (EC-Bilt-Clio) for a better dynamical understanding of the relationship between large-scale circulation and European climate. We thus present an alternative approach to reconstructing climate for the pre-instrumental period based on the assimilated model simulations. Furthermore, we present an independent method to extend the dynamic circulation analysis for anomalously cold European JFMA conditions back to the sixteenth century. To this end, we use documentary records that are spatially representative for the long instrumental records and derive, through modern analogs, large-scale SLP, surface temperature and precipitation fields. The skill of the analog method is tested in the virtual world of two three-dimensional climate simulations (ECHO-G and HadCM3). This endeavor offers new possibilities both to constrain climate models into a reconstruction mode (through the assimilation approach) and to better assess documentary data in a quantitative way.
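As an illustration of the CCA step, the sketch below links a gridded SLP field with a handful of station temperature series using scikit-learn. The array shapes and the random data are stand-ins for the actual SLP reconstruction and the seven instrumental records; no claim is made about the study's preprocessing (anomaly computation, filtering, significance testing).

```python
import numpy as np
from sklearn.cross_decomposition import CCA

# Hypothetical inputs: 248 winters x 100 SLP grid points, and the same
# winters x 7 station temperature series (standardised anomalies).
rng = np.random.default_rng(0)
slp = rng.standard_normal((248, 100))
temp = rng.standard_normal((248, 7))

# Fit two canonical modes linking large-scale circulation and station temperature.
cca = CCA(n_components=2)
cca.fit(slp, temp)
slp_scores, temp_scores = cca.transform(slp, temp)

# Canonical correlations: correlation between each pair of score series.
for k in range(2):
    r = np.corrcoef(slp_scores[:, k], temp_scores[:, k])[0, 1]
    print(f"mode {k + 1}: canonical correlation = {r:.2f}")
```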
Does published orthodontic research account for clustering effects during statistical data analysis?
Abstract:
In orthodontics, multiple site observations within patients or multiple observations collected at consecutive time points are often encountered. Clustered designs require larger sample sizes compared to individually randomized trials, as well as special statistical analyses that account for the fact that observations within clusters are correlated. The purpose of this study is to assess to what degree clustering effects are considered during design and data analysis in the three major orthodontic journals. The contents of the most recent 24 issues of the American Journal of Orthodontics and Dentofacial Orthopedics (AJODO), Angle Orthodontist (AO), and European Journal of Orthodontics (EJO), working backwards from December 2010, were hand searched. Articles with clustering effects were identified, along with whether the authors accounted for those effects. Additionally, information was collected on: involvement of a statistician, single- or multicenter study, number of authors in the publication, geographical area, and statistical significance. Of the 1584 articles, 1062 remained after exclusions and were assessed for clustering effects, of which 250 (23.5 per cent) were considered to have clustering effects in the design (kappa = 0.92, 95 per cent CI: 0.67-0.99 for inter-rater agreement). Of the studies with clustering effects, only 63 (25.2 per cent) indicated accounting for clustering effects. There was evidence that studies published in the AO had higher odds of accounting for clustering effects (AO versus AJODO: odds ratio (OR) = 2.17, 95 per cent confidence interval (CI): 1.06-4.43, P = 0.03; EJO versus AJODO: OR = 1.90, 95 per cent CI: 0.84-4.24, non-significant; and EJO versus AO: OR = 1.15, 95 per cent CI: 0.57-2.33, non-significant). The results of this study indicate that only about a quarter of the studies with clustering effects account for this in their statistical data analysis.
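The odds ratios quoted above follow the usual 2x2-table calculation with a Wald confidence interval on the log scale. The sketch below shows that standard calculation on made-up counts; the numbers are not the raw data of this study.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald 95% CI for a 2x2 table:

               accounted   not accounted
    journal 1      a             b
    journal 2      c             d
    """
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)   # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se_log)
    hi = math.exp(math.log(or_) + z * se_log)
    return or_, lo, hi

# Hypothetical counts, purely to illustrate the arithmetic.
print(odds_ratio_ci(20, 40, 10, 50))   # OR = 2.5 with its 95% CI
```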
Abstract:
Though 3D computer graphics has seen tremendous advancement in the past two decades, most available mechanisms for computer interaction in 3D are high cost and targeted for industry and virtual reality applications. Recent advances in Micro-Electro-Mechanical-System (MEMS) devices have brought forth a variety of new low-cost, low-power, miniature sensors with high accuracy, which are well suited for hand-held devices. In this work a novel design for a 3D computer game controller using inertial sensors is proposed, and a prototype device based on this design is implemented. The design incorporates MEMS accelerometers and gyroscopes from Analog Devices to measure the three components of the acceleration and angular velocity. From these sensor readings, the position and orientation of the hand-held compartment can be calculated using numerical methods. The implemented prototype is utilizes a USB 2.0 compliant interface for power and communication with the host system. A Microchip dsPIC microcontroller is used in the design. This microcontroller integrates the analog to digital converters, the program memory flash, as well as the core processor, on a single integrated circuit. A PC running Microsoft Windows operating system is used as the host machine. Prototype firmware for the microcontroller is developed and tested to establish the communication between the design and the host, and perform the data acquisition and initial filtering of the sensor data. A PC front-end application with a graphical interface is developed to communicate with the device, and allow real-time visualization of the acquired data.
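The numerical methods themselves are not detailed in the abstract; the simplest version is direct Euler integration of the gyroscope rates (once, for orientation) and of the accelerations (twice, for position). The sketch below shows that naive dead reckoning on hypothetical samples; it is not the prototype's firmware, and a practical controller would add gravity compensation, sensor-to-world rotation and drift correction.

```python
import numpy as np

def integrate_imu(accel, gyro, dt):
    """Naive dead reckoning from IMU samples (illustration only).

    accel : (N, 3) accelerations in the sensor frame [m/s^2], gravity removed
    gyro  : (N, 3) angular velocities [rad/s]
    dt    : sample period [s]
    Returns per-axis angles and positions over time.
    """
    angle = np.cumsum(gyro * dt, axis=0)          # orientation: one integration
    velocity = np.cumsum(accel * dt, axis=0)      # velocity: one integration
    position = np.cumsum(velocity * dt, axis=0)   # position: a second integration
    return angle, position

# Toy usage with hypothetical samples at 100 Hz.
rng = np.random.default_rng(2)
a = rng.normal(0.0, 0.05, size=(500, 3))
w = rng.normal(0.0, 0.01, size=(500, 3))
angles, pos = integrate_imu(a, w, dt=0.01)
print(angles[-1], pos[-1])
```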
Abstract:
The widespread availability of low-cost embedded electronics makes it easier to implement smart devices that can understand either the environment or user behaviour. The main objective of this project is to design and implement portable smart electronics for home use: a portable monitoring device for home and office security, and a portable 3D mouse for convenient use. Both devices use the MPU6050, which contains a 3-axis accelerometer and a 3-axis gyroscope, to sense the inertial motion of a door or of the user's hand. For the portable monitoring device, the MPU6050 senses the movement of a door (either a home front door or a cabinet door) through the gyroscope, and a Raspberry Pi processes the data it receives from the MPU6050; if a value exceeds a preset threshold, the Raspberry Pi triggers a USB webcam to take a picture and then sends an alert email with the picture to the user. The advantages of this device are that it is a small, portable, stand-alone device with its own power source, easy to implement, very cheap for residential use, and energy efficient with instantaneous alerts. For the 3D mouse, the MPU6050 uses both the accelerometer and the gyroscope to sense the user's hand movement; the data are processed by an MSP430G2553 through a digital smoothing filter and a complementary filter, and the filtered data are then passed to a personal computer through the serial COM port. By applying a cursor movement equation in the PC driver, this device works well as a mouse with acceptable accuracy. Compared to a normal optical mouse, this mouse does not need any working surface; with the smoothing and complementary filters it has sufficient accuracy for normal use, and it can easily be extended to a portable mouse as small as a finger ring.
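The complementary filter mentioned for the 3D mouse is a standard way to fuse the drift-free but noisy tilt angle implied by the accelerometer with the smooth but drifting integral of the gyroscope rate. A minimal sketch with hypothetical MPU6050 readings follows; it is not the project's MSP430 firmware, and the weight alpha is an assumed value.

```python
import math

def complementary_filter(pitch, accel_pitch, gyro_rate, dt, alpha=0.98):
    """One complementary-filter update for a tilt angle (illustration).

    pitch       : previous fused angle [deg]
    accel_pitch : angle implied by the accelerometer (gravity direction) [deg]
    gyro_rate   : angular rate from the gyroscope [deg/s]
    alpha       : weight of the fast (drifting) gyro path versus the
                  slow (noisy) accelerometer path
    """
    return alpha * (pitch + gyro_rate * dt) + (1.0 - alpha) * accel_pitch

def accel_to_pitch(ax, ay, az):
    """Pitch angle from raw accelerometer axes, assuming only gravity acts."""
    return math.degrees(math.atan2(ax, math.sqrt(ay * ay + az * az)))

# Toy loop over hypothetical MPU6050 readings sampled at 100 Hz.
pitch, dt = 0.0, 0.01
samples = [((0.1, 0.0, 9.8), 2.0)] * 100     # (accel xyz [m/s^2], gyro [deg/s])
for (ax, ay, az), rate in samples:
    pitch = complementary_filter(pitch, accel_to_pitch(ax, ay, az), rate, dt)
print(f"fused pitch = {pitch:.1f} deg")
```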
Abstract:
Desertification research conventionally focuses on the problem – that is, degradation – while neglecting the appraisal of successful conservation practices. Based on the premise that Sustainable Land Management (SLM) experiences are not sufficiently or comprehensively documented, evaluated, and shared, the World Overview of Conservation Approaches and Technologies (WOCAT) initiative (www.wocat.net), in collaboration with FAO’s Land Degradation Assessment in Drylands (LADA) project (www.fao.org/nr/lada/) and the EU’s DESIRE project (http://www.desire-project.eu/), has developed standardised tools and methods for compiling and evaluating the biophysical and socio-economic knowledge available about SLM. The tools allow SLM specialists to share their knowledge and assess the impact of SLM at the local, national, and global levels. As a whole, the WOCAT–LADA–DESIRE methodology comprises tools for documenting, self-evaluating, and assessing the impact of SLM practices, as well as for knowledge sharing and decision support in the field, at the planning level, and in scaling up identified good practices. SLM depends on flexibility and responsiveness to changing complex ecological and socioeconomic causes of land degradation. The WOCAT tools are designed to reflect and capture this capacity of SLM. In order to take account of new challenges and meet emerging needs of WOCAT users, the tools are constantly further developed and adapted. Recent enhancements include tools for improved data analysis (impact and cost/benefit), cross-scale mapping, climate change adaptation and disaster risk management, and easier reporting on SLM best practices to UNCCD and other national and international partners. Moreover, WOCAT has begun to give land users a voice by backing conventional documentation with video clips straight from the field. To promote the scaling up of SLM, WOCAT works with key institutions and partners at the local and national level, for example advisory services and implementation projects.
Keywords: Sustainable Land Management (SLM), knowledge management, decision-making, WOCAT–LADA–DESIRE methodology.
Abstract:
A wealth of genetic associations for cardiovascular and metabolic phenotypes in humans has been accumulating over the last decade, in particular a large number of loci derived from recent genome-wide association studies (GWAS). True complex disease-associated loci often exert modest effects, so their delineation currently requires integration of diverse phenotypic data from large studies to ensure robust meta-analyses. We have designed a gene-centric 50 K single nucleotide polymorphism (SNP) array to assess potentially relevant loci across a range of cardiovascular, metabolic and inflammatory syndromes. The array utilizes a "cosmopolitan" tagging approach to capture the genetic diversity across approximately 2,000 loci in populations represented in the HapMap and SeattleSNPs projects. The array content is informed by GWAS of vascular and inflammatory disease, expression quantitative trait loci implicated in atherosclerosis, pathway-based approaches and comprehensive literature searching. The custom flexibility of the array platform facilitated interrogation of loci at differing stringencies, according to a gene prioritization strategy that allows saturation of high-priority loci with a greater density of markers than the existing GWAS tools, particularly in African HapMap samples. We also demonstrate that the IBC array can be used to complement GWAS, increasing coverage in high-priority CVD-related loci across all major HapMap populations. DNA from over 200,000 extensively phenotyped individuals will be genotyped with this array, with a significant portion of the generated data being released into the academic domain, facilitating in silico replication attempts, analyses of rare variants and cross-cohort meta-analyses in diverse populations. These datasets will also facilitate more robust secondary analyses, such as explorations with alternative genetic models, epistasis and gene-environment interactions.
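The abstract does not describe the tagging algorithm itself. As a generic sketch of the idea behind an r²-based tagging approach (not the IBC array's actual selection pipeline), the code below greedily picks tag SNPs until every SNP is correlated above a chosen threshold with at least one tag; the threshold and the random genotypes are assumptions for the example.

```python
import numpy as np

def greedy_tag_snps(genotypes, r2_threshold=0.8):
    """Greedy r^2-based tag-SNP selection (generic sketch).

    genotypes : (n_individuals, n_snps) matrix of 0/1/2 allele counts
    Returns indices of tag SNPs such that every SNP has r^2 >= threshold
    with at least one selected tag.
    """
    r2 = np.corrcoef(genotypes, rowvar=False) ** 2   # SNP x SNP squared correlation
    n_snps = genotypes.shape[1]
    untagged = set(range(n_snps))
    tags = []
    while untagged:
        # Pick the SNP that covers the most still-untagged SNPs.
        best = max(untagged,
                   key=lambda s: sum(r2[s, t] >= r2_threshold for t in untagged))
        tags.append(best)
        untagged -= {t for t in untagged if r2[best, t] >= r2_threshold}
    return tags

# Toy usage on random genotypes (hypothetical data).
rng = np.random.default_rng(3)
G = rng.integers(0, 3, size=(50, 12)).astype(float)
print(greedy_tag_snps(G))
```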
Abstract:
Research on school-based sexual health education programs is at a critical juncture. With the growing number of evidence-based programs, more focus is needed on how to help schools adopt and implement these programs. The article in this issue titled “Sexual Health Education from the Perspectives of School Staff: Implications for Adoption and Implementation of Effective Programs in Middle School” provides data on individual cognitive factors that may influence adoption and implementation. This commentary explores another framework, the Concerns Based Adoption Model, as a tool for examining and supporting change associated with the adoption and implementation of sexual health education programs.
Abstract:
Background: The US has higher rates of teen births and sexually transmitted infections (STI) than other developed countries. Texas youth are disproportionately impacted. Purpose: To review local, state, and national data on teens’ engagement in sexual risk behaviors to inform policy and practice related to teen sexual health. Methods: 2009 middle school and high school Youth Risk Behavior Survey (YRBS) data, together with data from All About Youth, a middle school study conducted in a large urban school district in Texas, were analyzed to assess the prevalence of sexual initiation, including the initiation of non-coital sex, and the prevalence of sexual risk behaviors among Texas and US youth. Results: A substantial proportion of middle and high school students are having sex. Sexual initiation begins as early as 6th grade and increases steadily through 12th grade, with almost two-thirds of high school seniors being sexually experienced. Many teens are not protecting themselves from unintended pregnancy or STIs – nationally, 80% and 39% of high school students did not use birth control pills or a condom, respectively, the last time they had sex. Many middle and high school students are engaging in oral and anal sex, two behaviors that increase the risk of contracting an STI and HIV. In Texas, an estimated 689,512 out of 1,327,815 public high school students are sexually experienced – over half (52%) of the total high school population. Texas students surpass their US peers in several sexual risk behaviors, including number of lifetime sexual partners, being currently sexually active, and not using effective methods of birth control or dual protection when having sex. They are also less likely to receive HIV/AIDS education in school. Conclusion: Changes in policy and practice, including implementation of evidence-based sex education programs in middle and high schools and increased access to integrated, teen-friendly sexual and reproductive health services, are urgently needed at the state and national levels to address these issues effectively.