837 results for "Entropy of a sampling design"
Abstract:
The Interstellar Boundary Explorer (IBEX) has been directly observing neutral atoms from the local interstellar medium for the last six years (2009–2014). This paper ties together the 14 studies in this Astrophysical Journal Supplement Series Special Issue, which collectively describe the IBEX interstellar neutral results from this epoch and provide a number of other relevant theoretical and observational results. Interstellar neutrals interact with each other and with the ionized portion of the interstellar population in the "pristine" interstellar medium ahead of the heliosphere. Then, in the heliosphere's close vicinity, the interstellar medium begins to interact with escaping heliospheric neutrals. In this study, we compare the results from two major analysis approaches led by IBEX groups in New Hampshire and Warsaw. We also directly address the question of the distance upstream to the pristine interstellar medium and adjust both sets of results to a common distance of ~1000 AU. The two analysis approaches are quite different, but yield fully consistent measurements of the interstellar He flow properties, further validating our findings. While detailed error bars are given for both approaches, we recommend that for most purposes, the community use "working values" of ~25.4 km s⁻¹ inflow speed, ~75.7° ecliptic inflow longitude, ~−5.1° ecliptic inflow latitude, and ~7500 K temperature at ~1000 AU upstream. Finally, we briefly address future opportunities for even better interstellar neutral observations to be provided by the Interstellar Mapping and Acceleration Probe mission, which was recommended as the next major Heliophysics mission by the NRC's 2013 Decadal Survey.
Abstract:
Mixed longitudinal designs are important study designs for many areas of medical research. Mixed longitudinal studies have several advantages over cross-sectional or purely longitudinal studies, including shorter study completion time and the ability to separate time and age effects, making them an attractive choice. Statistical methodology for general longitudinal studies has developed rapidly within the last few decades. A common approach for statistical modeling in studies with mixed longitudinal designs has been the linear mixed-effects model incorporating an age or time effect. The general linear mixed-effects model is considered an appropriate choice for analyzing repeated measurements in longitudinal studies. However, linear mixed-effects models applied to mixed longitudinal studies often incorporate age as the only random effect and fail to take the cohort effect into consideration when conducting statistical inference on age-related trajectories of outcome measurements. We believe special attention should be paid to cohort effects when analyzing data from mixed longitudinal designs with multiple overlapping cohorts; this has therefore become an important statistical issue to address. This research aims to address statistical issues related to mixed longitudinal studies. The proposed study examined the existing statistical analysis methods for mixed longitudinal designs and developed an alternative analytic method that incorporates effects from multiple overlapping cohorts as well as from subjects of different ages. The proposed study used simulation to evaluate the performance of the proposed analytic method by comparing it with the commonly used model. Finally, the study applied the proposed analytic method to data collected by an existing study, Project HeartBeat!, which had previously been evaluated using traditional analytic techniques. Project HeartBeat! is a longitudinal study of cardiovascular disease (CVD) risk factors in childhood and adolescence using a mixed longitudinal design. The proposed model was used to evaluate four blood lipids, adjusting for age, gender, race/ethnicity, and endocrine hormones. The results of this dissertation suggest the proposed analytic model could be a more flexible and reliable choice than the traditional model, fitting the data better and providing more accurate estimates in mixed longitudinal studies. Conceptually, the proposed model described in this study has useful features, including consideration of effects from multiple overlapping cohorts, and is an attractive approach for analyzing data from mixed longitudinal design studies.
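To make the modeling issue concrete, here is a minimal sketch (our illustration, not the dissertation's code) contrasting the common age-only mixed model with one that also carries a fixed cohort effect. The column names and the use of statsmodels are assumptions.

```python
# Minimal sketch, assuming a long-format DataFrame with columns
# 'outcome', 'age', 'cohort', and 'subject' (all hypothetical names).
import statsmodels.formula.api as smf

def fit_models(df):
    # Common model: fixed age effect plus a random intercept and
    # random age slope per subject; no cohort term.
    age_only = smf.mixedlm("outcome ~ age", df,
                           groups=df["subject"], re_formula="~age").fit()
    # Alternative: additionally adjust for the overlapping cohorts.
    with_cohort = smf.mixedlm("outcome ~ age + C(cohort)", df,
                              groups=df["subject"], re_formula="~age").fit()
    return age_only, with_cohort
```

Comparing the two fits on the same data shows how ignoring the cohort term can bias the estimated age trajectory when cohorts overlap.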
Abstract:
The Phase I clinical trial is considered the "first in human" study in medical research to examine the toxicity of a new agent. It determines the maximum tolerable dose (MTD) of the agent, i.e., the highest dose at which toxicity is still acceptable. Several Phase I clinical trial designs have been proposed in the past 30 years. The well-known standard method, the so-called 3+3 design, is widely accepted by clinicians since it is the easiest to implement and requires no statistical calculation. The continual reassessment method (CRM), a design that uses Bayesian methods, has been rising in popularity over the last two decades, and several variants of the CRM design have been suggested in the statistical literature. The rolling six is a newer method, introduced in pediatric oncology in 2008, which claims to shorten trial duration compared with the 3+3 design. The goal of the present research was to simulate clinical trials and compare these Phase I clinical trial designs. The patient population was created by the discrete event simulation (DES) method. The characteristics of the patients were generated from several distributions, with parameters derived from a review of historical Phase I clinical trial data. Patients were then selected and enrolled in clinical trials, each of which used the 3+3 design, the rolling six, or a CRM design. Five dose-toxicity scenarios were used to compare the performance of the designs, with one thousand trials simulated per design per scenario. The results showed the rolling six design was not superior to the 3+3 design in terms of trial duration: the time to trial completion was comparable between the two, although both were shorter than the two CRM designs. Both CRMs were superior to the 3+3 design and the rolling six in accuracy of MTD estimation. The 3+3 design and the rolling six tended to assign more patients to undesirably low dose levels, while toxicities were slightly greater in the CRMs.
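As an illustration of the escalation rules being compared, here is a minimal sketch of the standard 3+3 design (our simplified illustration, not the dissertation's DES simulation code); tox_probs is a hypothetical dose-toxicity scenario of the kind the study varied.

```python
import random

def simulate_3plus3(tox_probs, seed=None):
    """Simulate one 3+3 trial. tox_probs[k] is the assumed true
    dose-limiting toxicity (DLT) probability at dose level k."""
    rng = random.Random(seed)
    level = 0
    while True:
        dlts = sum(rng.random() < tox_probs[level] for _ in range(3))
        if dlts == 1:
            # Ambiguous 1/3: expand the cohort by 3 at the same level.
            dlts += sum(rng.random() < tox_probs[level] for _ in range(3))
        if dlts >= 2:
            return level - 1   # too toxic; -1 means no tolerated dose
        if level == len(tox_probs) - 1:
            return level       # highest level tested declared MTD here
        level += 1             # 0/3 or 1/6 DLTs: escalate
```

Each call returns the selected MTD level, so running it a thousand times per scenario reproduces the kind of comparison described above.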
Abstract:
IBAMar (http://www.ba.ieo.es/ibamar) is a regional database that brings together all the physical and biochemical data obtained by multiparametric probes (CTDs equipped with different sensors) during the cruises managed by the Balearic Center of the Spanish Institute of Oceanography (COB-IEO). It has recently been extended to include data obtained from classical hydrographic casts using oceanographic Niskin or Nansen bottles. The result is a database with a main core of hydrographic data: temperature (T), salinity (S), dissolved oxygen (DO), fluorescence, and turbidity; complemented by biochemical data: dissolved inorganic nutrients (phosphate, nitrate, nitrite, and silicate) and chlorophyll-a. Different technologies and methodologies were used by different teams over the four decades of data sampling at the COB-IEO. Despite this, all data have been reprocessed using the same protocols, and a standard quality control (QC) has been applied to each variable, so IBAMar provides a regional database of homogeneous, good-quality data. Data acquisition and QC: 94% of the data come from Sea-Bird SBE 911 and SBE 25 CTDs. S and DO were calibrated on board using water samples whenever a rosette sampler was available (70% of the cases). All data from Sea-Bird CTDs were reviewed and post-processed with the software provided by Sea-Bird Electronics, and averaged to 1 dbar vertical resolution. The general sampling methodology and pre-processing are described at https://ibamardatabase.wordpress.com/home/. Manual QC includes visual checks of metadata, duplicate data, and outliers. Automatic QC includes range checks of variables by area (north of the Balearic Islands, south of the Balearic Islands, and the Alboran Sea) and depth (27 standard levels), checks for spikes, and checks for density inversions. Nutrient QC includes a preliminary control and a range check on the observed levels to detect outliers relative to objectively analyzed data fields. A quality flag is assigned as an integer, depending on the result of each QC check.
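As an illustration of the automatic QC steps described above, here is a minimal sketch (our own, not the IBAMar processing code) of a range check, a spike check, and a density-inversion check on a 1 dbar profile; the flag values and thresholds are assumptions.

```python
import numpy as np

GOOD, SUSPECT, BAD = 1, 3, 4   # illustrative integer flag convention

def qc_profile(values, valid_range, spike_threshold):
    """Flag a single-variable profile sampled at 1 dbar resolution."""
    flags = np.full(values.shape, GOOD, dtype=int)
    lo, hi = valid_range
    flags[(values < lo) | (values > hi)] = BAD        # range check
    # Spike check: deviation from the mean of the two neighbours.
    spike = np.abs(values[1:-1] - 0.5 * (values[:-2] + values[2:]))
    mask = spike > spike_threshold
    inner = flags[1:-1]                               # view writes through
    inner[mask] = np.maximum(inner[mask], SUSPECT)
    return flags

def density_inversion_flags(density):
    """Density should not decrease with depth; flag inversions."""
    flags = np.full(density.shape, GOOD, dtype=int)
    flags[1:][np.diff(density) < 0] = SUSPECT
    return flags
```

In the real database the valid ranges would differ by area and standard depth level, as the abstract describes.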
Abstract:
The Simultaneous Multiple Surfaces (SMS) method was developed as a design method in Nonimaging Optics during the 1990s. Later, the method was extended to the design of Imaging Optics. We present an overview of the method applied to imaging optics in planar (2D) geometry and compare the results with more classical designs based on achieving aplanatism of different orders. These classical designs can also be viewed as particular cases of SMS designs. Systems with up to four aspheric surfaces are shown. The SMS design strategy is shown to always perform better than the classical designs in terms of image quality. Moreover, the SMS method is a direct method, i.e., it is not based on multi-parametric optimization techniques. This gives the SMS method additional interest, since it can be used to explore solutions where multi-parameter techniques can get lost among multiple local minima.
Abstract:
We study the notion of approximate entropy within the framework of network theory. Approximate entropy is an uncertainty measure originally proposed in the context of dynamical systems and time series. We first define a purely structural entropy obtained by computing the approximate entropy of the so-called slide sequence. This is a surrogate of the degree sequence and is suggested by the frequency partition of a graph. We examine this quantity for standard scale-free and Erdős–Rényi networks. Using classical results of Pincus, we show that our entropy measure often converges with network size to a certain binary Shannon entropy. As a second step, with specific attention to networks generated by dynamical processes, we investigate the approximate entropy of horizontal visibility graphs. Visibility graphs allow us to naturally associate the notion of temporal correlations with a network, thereby giving the measure a dynamical garment. We show that approximate entropy distinguishes visibility graphs generated by processes of different complexity. This result further establishes these networks as probes for the study of dynamical systems. Applications to certain biological data arising in cancer genomics are finally considered in the light of both approaches.
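For reference, a minimal sketch of the two ingredients named above, Pincus's approximate entropy and the horizontal visibility graph, in their standard textbook form (our illustration; the parameter defaults are conventional, not the paper's).

```python
import numpy as np

def approx_entropy(u, m=2, r=0.2):
    """ApEn(m, r) of a sequence u; r is the tolerance (often taken
    as a fraction of the sequence's standard deviation)."""
    u = np.asarray(u, dtype=float)

    def phi(m):
        n = len(u) - m + 1
        x = np.array([u[i:i + m] for i in range(n)])   # length-m templates
        # Chebyshev distance between every pair of templates.
        dist = np.max(np.abs(x[:, None, :] - x[None, :, :]), axis=2)
        c = np.mean(dist <= r, axis=1)   # match fraction (self-match > 0)
        return np.mean(np.log(c))

    return phi(m) - phi(m + 1)

def horizontal_visibility_edges(series):
    """Edges (i, j), i < j, of the horizontal visibility graph: linked
    iff every intermediate value lies strictly below both endpoints."""
    edges = []
    for i, yi in enumerate(series):
        for j in range(i + 1, len(series)):
            if all(series[k] < min(yi, series[j]) for k in range(i + 1, j)):
                edges.append((i, j))
    return edges
```

Applying approx_entropy to sequences derived from such graphs (for instance, the slide sequence or the HVG degree sequence) gives the kind of structural measure the paper studies.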
Abstract:
Auxetic materials (or metamaterials) are those with a negative Poisson ratio (NPR): they display the unexpected property of lateral expansion when stretched, as well as an equal and opposite densification when compressed. Such geometries are progressively being employed in the development of novel products, especially in the fields of intelligent expandable actuators, shape-morphing structures, and minimally invasive implantable devices. Although several auxetic and potentially auxetic geometries have been summarized in previous reviews and research, precise information regarding the properties relevant for design tasks is not always provided. In this study we present a comparison of two-dimensional and three-dimensional auxetic geometries carried out by means of computer-aided design and engineering tools (hereafter CAD–CAE). The first part of the study focuses on the development of a CAD library of auxetics. Once the library is developed, we simulate the behavior of the different auxetic geometries and elaborate a systematic comparison, considering relevant properties such as the Poisson ratio(s), the maximum attainable volume or area reduction, and the equivalent Young's modulus. We hope this may provide useful information for future designs of devices based on these interesting structures.
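As a small concrete anchor for the sign convention used above (our illustration, not from the paper): the Poisson ratio is the negative ratio of transverse to axial strain, so a geometry is auxetic exactly when this value is negative.

```python
def poisson_ratio(axial_strain: float, transverse_strain: float) -> float:
    """nu = -eps_transverse / eps_axial; auxetic iff nu < 0."""
    return -transverse_strain / axial_strain

# Example: a sample stretched 2% axially that also widens 1% laterally
# (both strains positive) has nu = -0.5, i.e. it is auxetic.
nu = poisson_ratio(0.02, 0.01)
assert nu == -0.5 and nu < 0
```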
Abstract:
The development of this work follows both a chronological line through the tasks carried out and a logical one, starting from a minimal knowledge of space systems and arriving at the complete design of an Electrical Power Calculation Module for a satellite, intended for use in a concurrent design facility (CDF).
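As an illustration of the kind of calculation such a module might perform (an assumption about its scope, not the actual CDF tool), here is a minimal solar-array sizing sketch using standard power-budget relations.

```python
SOLAR_CONSTANT = 1361.0  # W/m^2 at 1 AU

def required_array_area(power_demand_w: float,
                        cell_efficiency: float = 0.28,
                        inherent_degradation: float = 0.85,
                        cos_incidence: float = 1.0) -> float:
    """Solar array area (m^2) needed to meet a given power demand;
    all parameter values here are illustrative assumptions."""
    specific_power = (SOLAR_CONSTANT * cell_efficiency *
                      inherent_degradation * cos_incidence)
    return power_demand_w / specific_power

# Example: a 500 W demand needs roughly 1.5 m^2 under these assumptions.
print(round(required_array_area(500.0), 2))
```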
Abstract:
This thesis is devoted to the analysis of waveguides and the design of passive components, with emphasis on high-frequency applications. First, an analysis of waveguides with non-ideal metallic conductors is carried out, with the aim of establishing the upper frequency limit of the approximations commonly used at microwave frequencies to compute ohmic losses. Next, the design of different passive waveguide components is presented: filters, orthomode transducers (OMTs), polarizers, duplexers, and antenna feeds, operating at frequencies from 10 to 750 GHz. Properly designing components at high frequencies first requires understanding the new fabrication processes and then adapting the various components to meet electrical and geometrical specifications simultaneously. To this end, modifications and new waveguiding geometries for different applications and technological processes are presented, and their advantages over existing solutions are discussed. In addition, the work presented in this thesis covers the complete development of the devices: design, fabrication, and characterization of the aforementioned components. Finally, some of the developed devices have been designed to be integrated into different systems, improving the performance and capabilities of those systems.
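For context on the ohmic-loss approximations whose frequency limit the thesis examines, here is the standard textbook estimate (Pozar's formula) for the TE10 conductor loss of a rectangular waveguide under the usual surface-resistance approximation; this is a reference point, not the thesis's extended analysis.

```python
import math

C0 = 299_792_458.0    # speed of light, m/s
ETA0 = 376.730        # free-space wave impedance, ohm
MU0 = 4e-7 * math.pi  # vacuum permeability, H/m

def te10_conductor_loss(f_hz, a, b, sigma):
    """Attenuation (Np/m) of the TE10 mode; a, b in metres (a > b),
    sigma the wall conductivity in S/m."""
    fc = C0 / (2.0 * a)                 # TE10 cutoff frequency
    if f_hz <= fc:
        raise ValueError("below cutoff")
    rs = math.sqrt(math.pi * f_hz * MU0 / sigma)   # surface resistance
    ratio2 = (fc / f_hz) ** 2
    return (rs / (b * ETA0 * math.sqrt(1.0 - ratio2))
            * (1.0 + 2.0 * b / a * ratio2))

# Example: WR-90 (a = 22.86 mm, b = 10.16 mm), copper walls, 10 GHz.
alpha = te10_conductor_loss(10e9, 0.02286, 0.01016, 5.8e7)
print(alpha * 8.686, "dB/m")   # Np/m -> dB/m, about 0.11 dB/m
```

The thesis asks at what frequency such surface-resistance estimates stop being trustworthy for real, non-ideal conductors.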
Abstract:
Automated Teller Machines (ATMs) are sensitive self-service systems that require important investments in security and testing. ATM certifications are testing processes for machines that integrate software components from different vendors, performed before their deployment for public use. This project originated from the need to optimize the certification process in an ATM manufacturing company. The process identifies compatibility problems between software components through testing. It is composed of a huge number of manual user tasks, which makes the process very expensive and error-prone. Moreover, it is not possible to fully automate the process, as it requires human intervention to manipulate ATM peripherals. The project presented important challenges for the development team. First, this is a critical process, as all ATM operations rely on the software under test. Second, the context of use of ATM applications is vastly different from that of ordinary software. Third, ATMs' useful lifetime is beyond 15 years, and both new and old models need to be supported. Fourth, the know-how for efficient testing depends on each specialist and is not explicitly documented. Fifth, the huge number of tests and their importance imply the need for user efficiency and accuracy. All these factors led us to conclude that, besides the technical challenges, the usability of the intended software solution was critical to the project's success. This business context is the motivation of this Master's Thesis project. Our proposal focused on the development process applied: by combining user-centered design (UCD) with agile development, we ensured both the high priority of usability and the early mitigation of software development risks caused by all the technology constraints. We performed 23 development iterations and were finally able to deliver a working solution on time, meeting users' expectations. The project was evaluated through usability tests, in which 4 real users participated in different tests in the real context of use. The results were positive according to several metrics: error rate, efficiency, effectiveness, and user satisfaction. We discuss the problems found, the benefits, and the lessons learned in the process. Finally, we measured the expected project benefits by comparing the effort required by the current process and the new one (once the new software tool is adopted); the savings correspond to 40% less effort (man-hours) per certification. Future work includes additional evaluation of product usability in a real scenario (with customers) and measuring the benefits in terms of quality improvement.