856 results for "System analysis - Data processing"
Abstract:
Seamless phase II/III clinical trials are conducted in two stages with treatment selection at the first stage. In the first stage, patients are randomized to a control or one of k > 1 experimental treatments. At the end of this stage, interim data are analysed, and a decision is made concerning which experimental treatment should continue to the second stage. If the primary endpoint is observable only after some period of follow-up, data may be available at the interim analysis on some early outcome for a larger number of patients than those for whom the primary endpoint is available. These early endpoint data can thus be used for treatment selection. For two previously proposed approaches, the power has been shown to be greater for one method or the other depending on the true treatment effects and correlations. We propose a new approach that builds on the previously proposed approaches and uses data available at the interim analysis to estimate these parameters and then, on the basis of these estimates, chooses the treatment selection method with the highest probability of correctly selecting the most effective treatment. This method is shown to perform well compared with the two previously described methods for a wide range of true parameter values. In most cases, the performance of the new method is similar to, and in some cases better than, that of either of the two previously proposed methods.
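A minimal sketch of the selection idea in the abstract above, assuming a Monte Carlo comparison of two illustrative rules (select on the early endpoint vs. on the primary endpoint); the effect sizes, correlation, sample sizes and rule definitions are invented for illustration and are not the paper's actual procedure:

```python
import numpy as np

rng = np.random.default_rng(42)

def prob_correct_selection(effects, rho, n_early, n_primary, rule, n_sim=5_000):
    """Monte Carlo estimate of P(rule picks the arm with the largest true effect)."""
    k = len(effects)
    best = int(np.argmax(effects))
    cov = [[1.0, rho], [rho, 1.0]]
    correct = 0
    for _ in range(n_sim):
        # Correlated noise on the early and primary endpoint estimates, per arm.
        z = rng.multivariate_normal([0.0, 0.0], cov, size=k)
        early = effects + z[:, 0] / np.sqrt(n_early)
        primary = effects + z[:, 1] / np.sqrt(n_primary)
        chosen = int(np.argmax(early if rule == "early" else primary))
        correct += (chosen == best)
    return correct / n_sim

# Interim estimates (in practice these come from the stage-1 data).
est_effects, est_rho = np.array([0.2, 0.35, 0.3]), 0.6
p_early = prob_correct_selection(est_effects, est_rho, 100, 40, "early")
p_primary = prob_correct_selection(est_effects, est_rho, 100, 40, "primary")
# Choose the selection method with the higher estimated probability.
print("choose rule:", "early" if p_early >= p_primary else "primary")
```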
Abstract:
Environment monitoring applications using Wireless Sensor Networks (WSNs) have received considerable attention in recent years. In much of this research, tasks such as sensor data processing, decision making about environment states and events, and emergency message sending are performed by a remote server. This paper presents simulation results for a proposed cross-layer protocol for two different applications in which the reliability of delivered data, delay and network lifetime need to be considered. A WSN designed for the proposed applications needs efficient MAC and routing protocols to guarantee the reliability of the data delivered from source nodes to the sink. A cross-layer protocol based on the design given in [1] has been extended and simulated for the proposed applications, with new features such as route discovery algorithms added. Simulation results show that the proposed cross-layer protocol can conserve node energy and provide the required performance in terms of network lifetime, delay and reliability.
Abstract:
Conceptualisations of disability that emphasise the contextual and cultural nature of disability, and the embodiment of these within a national system of data collection, present a number of challenges, especially where this process is devolved to schools. The requirement for measures based on contextual and subjective experiences gives rise to particular difficulties in achieving parity in the way data are analysed and reported. This paper presents an account of the testing of a tool intended for use by schools as they collect data from parents to identify children who meet the criteria of disability established in the Disability Discrimination Acts (DDAs). Data were validated through interviews with parents and teachers and observations of children, and highlighted the pivotal role of the criterion of impact. The findings are set in the context of schools meeting their legal duties to identify disabled children and their support needs in a way that captures the complexity of disabled children's school lives and provides useful and usable data.
Abstract:
This special issue is focused on the assessment of algorithms for the observation of Earth's climate from environmental satellites. Climate data records derived by remote sensing are increasingly a key source of insight into the workings of and changes in Earth's climate system. Producers of data sets must devote considerable effort and expertise to maximise the true climate signals in their products and minimise effects of data processing choices and changing sensors. A key choice is the selection of algorithm(s) for classification and/or retrieval of the climate variable. Within the European Space Agency Climate Change Initiative, science teams undertook systematic assessment of algorithms for a range of essential climate variables. The papers in the special issue report some of these exercises (for ocean colour, aerosol, ozone, greenhouse gases, clouds, soil moisture, sea surface temperature and glaciers). The contributions show that assessment exercises must be designed with care, considering issues such as the relative importance of different aspects of data quality (accuracy, precision, stability, sensitivity, coverage, etc.), the availability and degree of independence of validation data, and the limitations of validation in characterising some important aspects of data (such as long-term stability or spatial coherence). As well as requiring a significant investment of expertise and effort, systematic comparisons are found to be highly valuable. They reveal the relative strengths and weaknesses of different algorithmic approaches under different observational contexts, and help ensure that scientific conclusions drawn from climate data records are robust rather than influenced by observational artifacts.
Abstract:
Sea-ice concentrations in the Laptev Sea simulated by the coupled North Atlantic-Arctic Ocean-Sea-Ice Model and the Finite Element Sea-Ice Ocean Model are evaluated using sea-ice concentrations from Advanced Microwave Scanning Radiometer-Earth Observing System satellite data and a polynya classification method for winter 2007/08. While developed to simulate large-scale sea-ice conditions, both models are analysed here in terms of polynya simulation. The main modification of both models in this study is the implementation of a landfast-ice mask. Simulated sea-ice fields from different model runs are compared with emphasis placed on the impact of this prescribed landfast-ice mask. We demonstrate that sea-ice models are not able to simulate flaw polynyas realistically when used without a fast-ice description. Our investigations indicate that without landfast ice and with coarse horizontal resolution the models overestimate the fraction of open water in the polynya. This is not because a realistic polynya appears, but because of a larger-scale reduction of ice concentrations and smoothed ice-concentration fields. After implementation of a landfast-ice mask, the polynya location is simulated realistically, but the total open-water area is still overestimated in most cases. The study shows that the fast-ice parameterization is essential for model improvement. However, further improvements are necessary in order to progress from the simulation of large-scale features in the Arctic towards a more detailed simulation of smaller-scale features (here polynyas) in an Arctic shelf sea.
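A conceptual sketch of what a prescribed landfast-ice mask amounts to, assuming the simplest possible treatment (ice held immobile in masked cells); the grid, fields and masking rule are illustrative, not the models' actual implementation:

```python
import numpy as np

ny, nx = 50, 80
u_ice = np.full((ny, nx), 0.10)          # ice velocity components, m/s (toy fields)
v_ice = np.full((ny, nx), -0.05)
fast_ice = np.zeros((ny, nx), dtype=bool)
fast_ice[:, :10] = True                  # prescribed fast-ice band along the coast

# Applied every time step before the dynamics update: fast ice does not move,
# so divergent drift just seaward of the mask can open a flaw polynya there.
u_ice[fast_ice] = 0.0
v_ice[fast_ice] = 0.0
print("fast-ice fraction of domain:", fast_ice.mean())
```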
Abstract:
A quality assessment of the CFC-11 (CCl3F), CFC-12 (CCl2F2), HF, and SF6 products from limb-viewing satellite instruments is provided by means of a detailed intercomparison. The climatologies in the form of monthly zonal mean time series are obtained from HALOE, MIPAS, ACE-FTS, and HIRDLS within the time period 1991–2010. The intercomparisons focus on the mean biases of the monthly and annual zonal mean fields and aim to identify their vertical, latitudinal and temporal structure. The CFC evaluations (based on MIPAS, ACE-FTS and HIRDLS) reveal that the uncertainty in our knowledge of the atmospheric CFC-11 and CFC-12 mean state, as given by satellite data sets, is smallest in the tropics and mid-latitudes at altitudes below 50 and 20 hPa, respectively, with a 1σ multi-instrument spread of up to ±5 %. For HF, the situation is reversed. The two available data sets (HALOE and ACE-FTS) agree well above 100 hPa, with a spread in this region of ±5 to ±10 %, while at altitudes below 100 hPa the HF annual mean state is less well known, with a spread of ±30 % and larger. The atmospheric SF6 annual mean states derived from two satellite data sets (MIPAS and ACE-FTS) show only very small differences, with a spread of less than ±5 % and often below ±2.5 %. While the overall agreement among the climatological data sets is very good for large parts of the upper troposphere and lower stratosphere (CFCs, SF6) or middle stratosphere (HF), individual discrepancies have been identified. Pronounced deviations between the instrument climatologies exist for particular atmospheric regions, which differ from gas to gas. Notable features are differently shaped isopleths in the subtropics, deviations in the vertical gradients in the lower stratosphere and in the meridional gradients in the upper troposphere, and inconsistencies in the seasonal cycle. Additionally, long-term drifts between the instruments have been identified for the CFC-11 and CFC-12 time series. The evaluations as a whole provide guidance on which data sets are the most reliable for applications such as studies of atmospheric transport and variability, model–measurement comparisons and detection of long-term trends. The data sets will be publicly available from the SPARC Data Centre and through PANGAEA (doi:10.1594/PANGAEA.849223).
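A minimal sketch of the basic quantities behind such an intercomparison (monthly zonal means, per-instrument biases and the 1σ multi-instrument spread), using synthetic stand-in fields rather than the real HALOE/MIPAS/ACE-FTS/HIRDLS data:

```python
import numpy as np

rng = np.random.default_rng(7)
lats = np.linspace(-85, 85, 18)              # latitude band centres
press = np.array([100.0, 50.0, 20.0])        # pressure levels, hPa
# (instrument, level, latitude): monthly zonal-mean CFC-12 fields, in ppt.
fields = 520 + rng.normal(0, 10, size=(3, press.size, lats.size))

multi_mean = fields.mean(axis=0)             # multi-instrument mean state
spread = fields.std(axis=0, ddof=1)          # 1-sigma multi-instrument spread
rel_spread = 100 * spread / multi_mean       # in percent, as quoted above
bias = fields - multi_mean                   # per-instrument mean bias fields
print(rel_spread.round(2))
```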
Abstract:
This work describes a novel methodology for automatic contour extraction from 2D images of 3D neurons (e.g. camera lucida images and other types of 2D microscopy). Most contour-based shape analysis methods cannot be used to characterize such cells because of overlaps between neuronal processes. The proposed framework is specifically aimed at the problem of contour following even in the presence of multiple overlaps. First, the input image is preprocessed in order to obtain an 8-connected skeleton with one-pixel-wide branches, as well as a set of critical regions (i.e., bifurcations and crossings). Next, for each subtree, the tracking stage iteratively labels all valid pixels of branches up to a critical region, where it determines the suitable direction to proceed. Finally, the labeled skeleton segments are followed in order to yield the parametric contour of the neuronal shape under analysis. The reported system was successfully tested on several images, and the results for a set of three neuron images are presented here, each pertaining to a different class (alpha, delta and epsilon ganglion cells) and containing a total of 34 crossings, all of which the algorithm resolved successfully. The method has also been found to exhibit robustness even for images with close parallel segments, and it may be implemented in an efficient manner. The introduction of this approach should pave the way for more systematic application of contour-based shape analysis methods in neuronal morphology. (C) 2008 Elsevier B.V. All rights reserved.
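A minimal sketch of the preprocessing stage described above, assuming scikit-image for skeletonization: it builds a toy binary image, extracts a one-pixel-wide skeleton, and flags critical regions as skeleton pixels with three or more neighbours. The tracking and contour-following stages are not reproduced here.

```python
import numpy as np
from scipy.ndimage import convolve
from skimage.morphology import skeletonize

image = np.zeros((64, 64), dtype=bool)   # toy stand-in for a binarized neuron
image[32, 8:56] = True                   # one neuronal process
image[10:54, 32] = True                  # a second process crossing the first

skeleton = skeletonize(image)            # one-pixel-wide, 8-connected branches

# Count the 8-neighbours of every skeleton pixel; 3 or more marks a critical
# region (bifurcation or crossing) where a tracking stage must pick a direction.
kernel = np.array([[1, 1, 1], [1, 0, 1], [1, 1, 1]])
neighbours = convolve(skeleton.astype(int), kernel, mode='constant')
critical = skeleton & (neighbours >= 3)
print("critical-region pixels:", np.argwhere(critical))
```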
Abstract:
Thermal analysis has been extensively used to obtain information about drug-polymer interactions and to perform pre-formulation studies of pharmaceutical dosage forms. In this work, biodegradable microparticles of poly(D,L-lactide-co-glycolide) (PLGA) containing ciprofloxacin hydrochloride (CP) in various drug:polymer ratios were obtained by spray drying. The main purpose of this study was to investigate the effect of the spray drying process on the drug-polymer interactions and on the stability of the microparticles using differential scanning calorimetry (DSC), thermogravimetry (TG), derivative thermogravimetry (DTG) and infrared spectroscopy (IR). The results showed that the high levels of encapsulation efficiency were dependent on the drug:polymer ratio. DSC and TG/DTG analyses showed that the thermal profiles of physical mixtures of the microparticle components differed from the signals obtained with the pure substances. Thermal analysis data disclosed that a physical interaction between CP and PLGA occurred at high temperatures. The DSC and TG profiles of drug-loaded microparticles were very similar to those of the physical mixtures of the components, and it was possible to characterize the thermal properties of the microparticles according to drug content. These data indicate that the spray drying technique does not affect the physicochemical properties of the microparticles. In addition, the results are in agreement with the IR data analysis, demonstrating that no significant chemical interaction occurs between CP and PLGA in either physical mixtures or microparticles. In conclusion, we have found that the spray drying procedure used in this work can be a reliable methodology to produce CP-loaded microparticles. (C) 2007 Elsevier B.V. All rights reserved.
Abstract:
The cost of a road over its service life is a function of the design, the quality of construction, and the maintenance strategies and operations. Unfortunately, designers often neglect a very important aspect, namely the possibility of performing future maintenance activities; the focus is mainly on other aspects such as investment costs, traffic safety, aesthetic appearance, regional development and environmental effects. This licentiate thesis is part of a Ph.D. project entitled "Road Design for Lower Maintenance Costs" that aims to examine how life-cycle costs can be optimized by selecting appropriate geometrical designs for roads and their components. The result is expected to provide the basis for a new method for the road planning and design process, using life-cycle cost analysis with particular emphasis on road maintenance. The project started with a literature review intended to study the conditions causing increased needs for road maintenance, the efforts made by road authorities to satisfy those needs, and the potential for improvement through consideration of maintenance aspects during planning and design. An investigation was carried out to identify the problems that obstruct due consideration of maintenance aspects during the road planning and design process. This investigation focused mainly on the road planning and design process at the Swedish Road Administration; however, the corresponding processes in Denmark, Finland and Norway were also broadly evaluated to gain wider knowledge of the research subject. The investigation was carried out in two phases, data collection and data analysis. Data were collected through semi-structured interviews with expert actors involved in planning, design and maintenance, and through a review of design-related documents. Data analyses were carried out using a method called "Change Analysis". This investigation revealed a complex combination of problems resulting in inadequate consideration of maintenance aspects, and several urgent needs for changes to eliminate these problems were identified. Another study was carried out to develop a model for calculating the repair costs of damage to different road barrier types, and to analyse how factors such as road type, speed limit, barrier type, barrier placement, type of road section, alignment and seasonal effects affect barrier damage and the associated repair costs. This study used the "Case Study Research Method". Data were collected from 1087 barrier repairs in two regional offices of the Swedish Road Administration, the Central Region and the Western Region. For both regions, a table was established containing the repair cost per vehicle kilometre for different combinations of barrier type, road type and speed limit; this table can be used by designers in calculating the life-cycle costs of different road barrier types.
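A minimal sketch of how such a repair-cost-per-vehicle-kilometre table could be assembled with pandas; the barrier types, costs and traffic volumes are invented placeholders, not the thesis's data:

```python
import pandas as pd

repairs = pd.DataFrame({
    "barrier_type": ["w-beam", "w-beam", "cable", "cable"],
    "road_type":    ["motorway", "2+1", "motorway", "2+1"],
    "speed_limit":  [110, 100, 110, 100],
    "repair_cost":  [12_000, 9_500, 4_200, 3_800],   # SEK per repair event
})
# Traffic volume per combination, in million vehicle-km over the study period.
traffic_mvkm = pd.Series(
    {("w-beam", "motorway", 110): 420.0, ("w-beam", "2+1", 100): 150.0,
     ("cable", "motorway", 110): 380.0, ("cable", "2+1", 100): 160.0})
traffic_mvkm.index = traffic_mvkm.index.set_names(
    ["barrier_type", "road_type", "speed_limit"])

totals = repairs.groupby(["barrier_type", "road_type", "speed_limit"])["repair_cost"].sum()
cost_per_vkm = totals / (traffic_mvkm * 1e6)   # SEK per vehicle-km
print(cost_per_vkm)
```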
Abstract:
The project comprised the investigation and development of a photovoltaic (PV) system capable of supplying electricity to a mechanical ventilation system with heat recovery (an FTX system) in a multi-family building from the Swedish Million Programme. To assess the size and design of the components, information was gathered through: information searches in databases, course literature and interviews; simulation of the PV modules in the software PVSYST; and modelling of the ventilation ducts in the software MagiCAD. The primary aim was to investigate whether a theoretically functioning system, covering both PV and ventilation, could be devised. Depending on the outcome, the economics of the project would also be examined. The investigation showed that it should theoretically be possible to install PV modules for electricity generation that cover the FTX system's electricity demand on an annual basis. The PV modules are also estimated to produce enough electricity for some other electricity-demanding equipment during large parts of the year. It was also found that the PV installation could be economically profitable if a calculation period of 14 years is used. The method used to obtain these results is described in detail and is, with minor modifications, applicable to a large number of buildings in the Swedish building stock. An important conclusion is that if property owners can look 15 years ahead for an investment in solar energy, it would yield not only environmental but also economic benefits. The knowledge, technology and products needed to supply a large share of buildings' electricity demand from solar energy already exist today.
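A minimal sketch of the kind of profitability check implied by the 14-year calculation period, written as a simple net-present-value sum; all figures (investment, yield, electricity price, discount rate) are invented placeholders, not values from the project:

```python
# All figures below are invented placeholders.
annual_yield_kwh = 9_000      # PV electricity used on site per year
electricity_price = 1.5       # SEK/kWh of avoided purchase
investment = 140_000          # SEK, turnkey PV system
discount_rate = 0.04
years = 14                    # the calculation period mentioned above

npv = -investment + sum(
    annual_yield_kwh * electricity_price / (1 + discount_rate) ** t
    for t in range(1, years + 1))
print(f"NPV over {years} years: {npv:,.0f} SEK")
```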
Abstract:
Objective: To develop a method for objective quantification of PD motor symptoms related to Off episodes and peak-dose dyskinesias, using spiral data gathered with a touch screen telemetry device. The aim was to objectively characterize predominant motor phenotypes (bradykinesia and dyskinesia), to help automate the visual interpretation of movement anomalies in spirals as rated by movement disorder specialists. Background: A retrospective analysis was conducted on recordings from 65 patients with advanced idiopathic PD from nine different clinics in Sweden, recruited from January 2006 until August 2010. In addition to the patient group, 10 healthy elderly subjects were recruited. Upper limb movement data were collected using a touch screen telemetry device in the subjects' home environments. Measurements with the device were performed four times per day during week-long test periods. On each test occasion, the subjects were asked to trace pre-drawn Archimedean spirals, shown on the screen of the device, using the dominant hand. The spiral test was repeated three times per test occasion, and the subjects were instructed to complete it within 10 seconds. The device had a sampling rate of 10 Hz and measured both the position and the time stamps (in milliseconds) of the pen tip. Methods: Four independent raters (FB, DH, AJ and DN) used a web interface that animated the spiral drawings, allowed them to observe different kinematic features during the drawing process, and let them rate task performance. Initially, a number of kinematic features were assessed, including 'impairment', 'speed', 'irregularity' and 'hesitation', followed by marking the predominant motor phenotype on a 3-category scale: tremor, bradykinesia and/or choreatic dyskinesia. There were only 2 test occasions for which all four raters either classified the phenotype as tremor or could not identify it; the two main motor phenotype categories were therefore bradykinesia and dyskinesia. 'Impairment' was rated on a scale from 0 (no impairment) to 10 (extremely severe), whereas 'speed', 'irregularity' and 'hesitation' were rated on a scale from 0 (normal) to 4 (extremely severe). The proposed data-driven method consisted of the following steps. Initially, 28 spatiotemporal features were extracted from the time series signals before being presented to a Multilayer Perceptron (MLP) classifier. The features were based on different kinematic quantities of the spirals, including radius, angle, speed and velocity, with the aim of measuring the severity of involuntary symptoms and discriminating between PD-specific (bradykinesia) and/or treatment-induced symptoms (dyskinesia). A Principal Component Analysis was applied to the features to reduce their dimensionality; 4 relevant principal components (PCs) were retained and used as inputs to the MLP classifier. Finally, the MLP classifier mapped these components to the corresponding visually assessed motor phenotype scores, automating the scoring of bradykinesia and dyskinesia in PD patients while they draw spirals using the touch screen device. For motor phenotype (bradykinesia vs. dyskinesia) classification, stratified 10-fold cross validation was employed.
Results: There was good agreement among the four raters when rating the individual kinematic features, with intra-class correlation coefficients (ICC) of 0.88 for 'impairment', 0.74 for 'speed' and 0.70 for 'irregularity', and moderate agreement when rating 'hesitation', with an ICC of 0.49. When assessing the two main motor phenotype categories (bradykinesia or dyskinesia) in animated spirals, the agreement among the four raters ranged from fair to moderate. There were good correlations between the mean ratings of the four raters on individual kinematic features and the computed scores. The MLP classifier classified the motor phenotype, that is, bradykinesia or dyskinesia, with an accuracy of 85% in relation to the visual classifications of the four movement disorder specialists. The test-retest reliability of the four PCs across the three spiral test trials was assessed with Cronbach's Alpha, yielding coefficients of 0.80, 0.82, 0.54 and 0.49, respectively, indicating that the computed scores are reasonably stable and consistent over time. Significant differences were found between the two groups (patients and healthy elderly subjects) in all the PCs except PC3. Conclusions: The proposed method automatically assessed the severity of unwanted symptoms and could discriminate reasonably well between PD-specific and/or treatment-induced motor symptoms, in relation to the visual assessments of movement disorder specialists. The objective assessments could provide a time-effect summary score that could be useful for improving decision-making during symptom evaluation of individualized treatment, when the goal is to maximize functional On time for patients while minimizing their Off episodes and troublesome dyskinesias.
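A minimal sketch of the classification pipeline described above (standardized features → PCA with 4 components → MLP, evaluated with stratified 10-fold cross-validation), using scikit-learn; the synthetic features and the network hyperparameters are stand-ins, not the study's settings:

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import StratifiedKFold, cross_val_score

rng = np.random.default_rng(3)
X = rng.normal(size=(400, 28))            # 28 spatiotemporal spiral features
y = rng.integers(0, 2, size=400)          # 0 = bradykinesia, 1 = dyskinesia

pipe = make_pipeline(
    StandardScaler(),
    PCA(n_components=4),                  # the 4 retained principal components
    MLPClassifier(hidden_layer_sizes=(10,), max_iter=2000, random_state=0),
)
cv = StratifiedKFold(n_splits=10, shuffle=True, random_state=0)
acc = cross_val_score(pipe, X, y, cv=cv)  # stratified 10-fold CV accuracy
print(f"mean CV accuracy: {acc.mean():.2f}")
```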
Abstract:
Drinking water utilities in urban areas are focused on finding smart solutions to new challenges in their real-time operation: limited water resources, intensive energy requirements, a growing population, a costly and ageing infrastructure, increasingly stringent regulations, and increased attention to the environmental impact of water use. Such challenges force water managers to monitor and control not only water supply and distribution but also consumer demand. This paper presents and discusses novel methodologies and procedures towards an integrated water resource management system based on advanced ICT technologies of automation and telecommunications for greatly improving the efficiency of drinking water networks (DWN) in terms of water use, energy consumption, water loss minimization, and water quality guarantees. In particular, the paper addresses the first results of the European project EFFINET (FP7-ICT2011-8-318556), devoted to the monitoring and control of the DWN in Barcelona (Spain). Results are split into two levels according to different management objectives: (i) the monitoring level is concerned with all aspects involved in observing the current state of the system and detecting/diagnosing abnormal situations; it is achieved through sensors and communications technology, together with mathematical models; (ii) the control level is concerned with computing the best suitable and admissible control strategies for network actuators so as to optimize a given set of operational goals related to the performance of the overall system. This level covers network control (optimal management of water and energy) and demand management (smart metering, efficient supply). Using the Barcelona DWN as the case study makes it possible to demonstrate the general applicability of the proposed integrated ICT solutions and their effectiveness in the management of DWNs, with considerable savings in electricity costs and reduced water loss, while ensuring the high European standards of water quality for citizens.
Abstract:
Drinking water distribution networks risk exposure to malicious or accidental contamination. Several levels of response are conceivable. One of them consists of installing a sensor network to monitor the system in real time. Once contamination has been detected, it is also important to take appropriate counter-measures. In the SMaRT-OnlineWDN project, this relies on modeling to predict both hydraulics and water quality. Online use of a model makes it possible to identify the contaminant source and simulate the contaminated area. The objective of this paper is to present the SMaRT-OnlineWDN experience and research results for hydraulic state estimation with a sampling frequency of a few minutes. A least squares problem with bound constraints is formulated to adjust demand class coefficients to best fit the observed values at a given time. The criterion is a Huber function to limit the influence of outliers, and a Tikhonov regularization is introduced to take prior information on the parameter vector into account. The Levenberg-Marquardt algorithm, which uses derivative information to limit the number of iterations, is then applied. Confidence intervals for the state prediction are also given. The results are presented and discussed for real networks in France and Germany.
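A minimal sketch of the estimation step described above, assuming SciPy's least_squares; the toy hydraulic model, data and weights are illustrative. Note that SciPy's Levenberg-Marquardt method does not support bounds, so a trust-region reflective solver stands in for it here:

```python
import numpy as np
from scipy.optimize import least_squares

def simulate_network(c):
    """Placeholder hydraulic model: maps demand class coefficients c
    to predicted sensor readings (pressures/flows)."""
    A = np.array([[1.0, 0.5], [0.3, 1.2], [0.8, 0.7]])  # toy sensitivity matrix
    return A @ c

observed = np.array([1.6, 1.9, 1.4])   # sensor values at the current time step
c_prior = np.array([1.0, 1.0])         # prior demand class coefficients
lam = 0.1                              # Tikhonov regularization weight

def residuals(c):
    # Measurement residuals plus a Tikhonov penalty pulling c toward the prior.
    return np.concatenate([simulate_network(c) - observed,
                           np.sqrt(lam) * (c - c_prior)])

# loss='huber' limits the influence of outliers, as in the paper; 'trf'
# replaces Levenberg-Marquardt because scipy's 'lm' cannot handle bounds.
sol = least_squares(residuals, c_prior, bounds=(0.0, 5.0),
                    method='trf', loss='huber', f_scale=0.1)
print("fitted demand class coefficients:", sol.x)
```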
Abstract:
This paper presents a study of credit card customers of a large retailer, aimed at measuring the risk of abandonment of a relationship when there is already a purchase history. Two activities are central to this study: the theoretical review and the methodological procedures. The first step was understanding the problem, the importance of the theme and the definition of the research methods. The study includes a bibliographic survey covering several authors and shows that customer loyalty is the basis of sustainability and profitability for organizations in various market segments; it examines satisfaction as the key to success for customer acquisition and, especially, for customer loyalty. Logistic-linear models were fitted, and the best model was selected using the Kolmogorov-Smirnov (KS) test and the Receiver Operating Characteristic (ROC) curve. Registration and transactional data from 100,000 customers of the credit card issuer were used; the software used was SPSS, a system for data manipulation, statistical analysis and graphical presentation. The result is a score that identifies each customer's risk of abandoning the product.
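A minimal sketch of the modelling and model-selection metrics described above (a logistic model, the ROC curve's AUC and the KS statistic), written in Python with synthetic data; the original study used SPSS, and the features here are invented stand-ins:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score
from scipy.stats import ks_2samp

rng = np.random.default_rng(0)
n = 10_000
X = rng.normal(size=(n, 4))               # e.g. recency, frequency, spend, tenure
logit = 0.8 * X[:, 0] - 1.2 * X[:, 1] + 0.3
y = (rng.uniform(size=n) < 1 / (1 + np.exp(-logit))).astype(int)  # 1 = abandoned

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
model = LogisticRegression().fit(X_tr, y_tr)
score = model.predict_proba(X_te)[:, 1]   # abandonment-risk score per customer

# KS: maximum separation between score distributions of the two classes.
auc = roc_auc_score(y_te, score)
ks = ks_2samp(score[y_te == 1], score[y_te == 0]).statistic
print(f"ROC AUC = {auc:.3f}, KS = {ks:.3f}")
```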
Abstract:
The processing of orbital images using remote sensing techniques generated qualitative information of a textural nature (morpho-structures). This information (1) allowed the recognition of areas with different structural patterns and different potential for fluorite prospecting, (2) allowed the identification of new structural lineaments potentially favourable to mineralization, (3) revealed extensive prolongations of the main mineralized structures, and (4) showed that a large number of previously unknown structures with great prospective potential are associated with them. The refinement of digital classification techniques applied to band-ratio products and principal component analysis made it possible to identify the hydrothermal alteration associated with the structures, adding new criteria for fluorite prospecting. To quantify the hydrothermal alteration data, a spectroradiometric analysis of the rocks of the fluorite district was carried out. By integrating this information with TM LANDSAT 5 data at reflectance level, a spectral classification of the orbital images was obtained, which allowed the identification of smaller structures in a level of detail never obtained before. The processing of airborne geophysical data yielded results on structures (magnetometry) and on granitic bodies affected by hydrothermal alteration (airborne gamma-ray spectrometry). These products were integrated with TM LANDSAT 5 data, associating the textural attributes of the orbital image with the radiometric behaviour of the rocks. The Grão-Pará lineament was identified as the main prospect of the district. A body of data was also gathered on the tectonic compartmentalization of the region, the facies zonation of the granitic rocks (the source rock of fluorine) and the hydrothermal alteration associated with the granitic magmatism. This allowed an understanding of the regional distribution of the fluorite deposits and added a new criterion to fluorite prospecting: the spatial relationship between the mineralization and the F source rock. The latter corresponds to the granitic facies at the border of the Pedras Grandes Massif.
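A minimal sketch of the band-ratio and principal-component processing described above, applied to a synthetic stand-in for a TM LANDSAT 5 reflectance cube; the band combinations shown are common choices for alteration mapping, not necessarily the ones used in the study:

```python
import numpy as np

# Toy stand-in for a reflectance cube: (bands, rows, cols) for TM bands 1-5, 7.
tm = np.random.default_rng(1).uniform(0.05, 0.6, size=(6, 100, 100))
b1, b2, b3, b4, b5, b7 = tm

eps = 1e-6
ratio_57 = b5 / (b7 + eps)   # often used to enhance clay/hydroxyl alteration
ratio_31 = b3 / (b1 + eps)   # often used to enhance iron oxides

# Principal component analysis over the flattened band stack.
stack = tm.reshape(6, -1).T                 # (pixels, bands)
stack = stack - stack.mean(axis=0)
cov = np.cov(stack, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)
pcs = stack @ eigvecs[:, ::-1]              # components ordered by variance
pc_images = pcs.T.reshape(6, 100, 100)      # back to image space
print(ratio_57.shape, pc_images.shape)
```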