953 results for statistical narrow band model


Relevance: 100.00%

Abstract:

We report on experimental studies of the Kondo physics and the development of non-Fermi-liquid scaling in the UCu(4+x)Al(8-x) family. We studied 7 different compounds with compositions between x = 0 and 2. We measured electrical transport (down to 65 mK) and thermoelectric power (down to 1.8 K) as a function of temperature, hydrostatic pressure, and/or magnetic field. Compounds with Cu content below x = 1.25 exhibit long-range antiferromagnetic order at low temperatures. Magnetic order is suppressed with increasing Cu content, and our data indicate a possible quantum critical point at x_cr ≈ 1.15. For compounds with higher Cu content, non-Fermi-liquid behavior is observed. Non-Fermi-liquid scaling is inferred from electrical resistivity results for the x = 1.25 and 1.5 compounds. For compounds with even higher Cu content, a sharp kink occurs in the resistivity data at low temperatures, which may be indicative of another quantum critical point at higher Cu compositions. For the magnetically ordered compounds, hydrostatic pressure is found to increase the Neel temperature, which can be understood in terms of Kondo physics. For the non-magnetic compounds, application of a magnetic field promotes a tendency toward Fermi-liquid behavior. Thermoelectric power was analyzed using a two-band Lorentzian model, and the results indicate one fairly narrow band (10 meV and below) and a second, broad band (around one hundred meV). The results imply that there are two relevant energy scales that need to be considered for the physics of this family of compounds. (C) 2011 Elsevier B.V. All rights reserved.
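The abstract does not give the functional form of the two-band Lorentzian model; a commonly used phenomenological form for the thermopower of Kondo systems is S(T) = a1*T/(T1^2 + T^2) + a2*T/(T2^2 + T^2), with the characteristic temperatures T1 and T2 playing the role of the narrow- and broad-band widths. The sketch below fits synthetic data with this assumed form; the coefficients and data are illustrative only.

```python
# Hypothetical illustration: fitting thermoelectric power S(T) with an assumed
# two-band Lorentzian form S(T) = a1*T/(T1^2 + T^2) + a2*T/(T2^2 + T^2).
# Neither the functional form nor the data below are taken from the paper.
import numpy as np
from scipy.optimize import curve_fit

def two_band_lorentzian(T, a1, T1, a2, T2):
    """Sum of a narrow-band and a broad-band Lorentzian contribution."""
    return a1 * T / (T1**2 + T**2) + a2 * T / (T2**2 + T**2)

# Synthetic "measurement": narrow band ~30 K (a few meV), broad band ~1000 K (~100 meV)
T = np.linspace(2, 300, 150)
S_obs = two_band_lorentzian(T, 400.0, 30.0, 3.0e4, 1000.0) + np.random.normal(0, 0.05, T.size)

popt, _ = curve_fit(two_band_lorentzian, T, S_obs, p0=(100.0, 20.0, 1.0e4, 500.0))
a1, T1, a2, T2 = popt
# Characteristic temperatures translate to band widths via k_B*T (1 meV ~ 11.6 K)
print(f"narrow band ~ {T1 / 11.6:.1f} meV, broad band ~ {T2 / 11.6:.1f} meV")
```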

Relevance: 100.00%

Abstract:

We contrast four distinct versions of the BCS-Bose statistical crossover theory according to the form assumed for the electron-number equation that accompanies the BCS gap equation. The four versions correspond to explicitly accounting for two-hole (2h) as well as two-electron (2e) Cooper pairs (CPs), or both in equal proportions, or only either kind. This follows from a recent generalization of Bose-Einstein condensation (GBEC), a statistical theory that includes not boson-boson interactions but rather 2e-CPs and also (without loss of generality) 2h-CPs interacting with unpaired electrons and holes in a single-band model that is easily converted into a two-band model. The GBEC theory is essentially an extension of the Friedberg-Lee 1989 BEC theory of superconductors, which excludes 2h-CPs. It can thus recover, when the numbers of 2h- and 2e-CPs in both BE-condensed and non-condensed states are separately equal, the BCS gap equation for all temperatures and couplings, as well as the zero-temperature BCS (rigorous-upper-bound) condensation energy for all couplings. But ignoring either 2h- or 2e-CPs it can do neither. In particular, only half the BCS condensation energy is obtained in the two crossover versions ignoring either kind of CPs. We show that critical temperatures T_c from the original BCS-Bose crossover theory in 2D require unphysically large couplings for the Cooper/BCS model interaction in order to differ significantly from the T_c values of ordinary BCS theory (where the number equation is replaced by the assumption that the chemical potential equals the Fermi energy). (c) 2007 Published by Elsevier B.V.
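For readers unfamiliar with the crossover framework, the two coupled equations referred to above take the generic textbook form below at zero temperature; this is the standard BCS-Bose crossover pair, not the GBEC equations of the paper.

```latex
% Standard BCS-Bose crossover equations at T = 0 (generic textbook form, not the
% GBEC equations of the paper): the gap equation and the electron-number equation
% are solved simultaneously for the gap Delta and the chemical potential mu.
\begin{align}
  1 &= \frac{V}{2} \sum_{\mathbf{k}} \frac{1}{E_{\mathbf{k}}},
  \qquad E_{\mathbf{k}} = \sqrt{(\epsilon_{\mathbf{k}} - \mu)^2 + \Delta^2}, \\
  n &= \sum_{\mathbf{k}} \left[ 1 - \frac{\epsilon_{\mathbf{k}} - \mu}{E_{\mathbf{k}}} \right].
\end{align}
% Ordinary BCS theory drops the second equation and simply sets mu = E_F.
```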

Relevance: 100.00%

Abstract:

Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)

Relevance: 100.00%

Abstract:

In this letter, a semiautomatic method for road extraction in object space is proposed that combines a stereoscopic pair of low-resolution aerial images with a digital terrain model (DTM) structured as a triangulated irregular network (TIN). First, we formulate an objective function in object space to allow the modeling of roads in 3-D. In this model, the TIN-based DTM allows the search for the optimal polyline to be restricted to a narrow band overlaid upon it. The optimal polyline for each road is then obtained by optimizing the objective function with a dynamic programming algorithm. A few seed points need to be supplied by an operator. To evaluate the performance of the proposed method, a set of experiments was designed using two stereoscopic pairs of low-resolution aerial images and a TIN-based DTM with an average resolution of 1 m. The experimental results showed that the proposed method worked properly, even when faced with anomalies along roads, such as obstructions caused by shadows and trees.
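As an illustration of the optimization step, the sketch below implements a generic Viterbi-style dynamic program over candidate polyline vertices; the cost terms are hypothetical stand-ins, not the objective function formulated in the letter.

```python
# Minimal sketch (not the authors' actual objective function): dynamic
# programming over candidate 3-D vertex positions for a road polyline.
# 'unary' stands in for image/terrain evidence at each candidate within the
# narrow band; 'pairwise' penalizes direction changes between segments.
import numpy as np

def dp_polyline(unary, pairwise):
    """unary: (n_stations, n_candidates) costs; pairwise: (n_candidates, n_candidates)
    transition costs. Returns the best candidate index at each station."""
    n, m = unary.shape
    cost = unary[0].copy()
    back = np.zeros((n, m), dtype=int)
    for i in range(1, n):
        total = cost[:, None] + pairwise + unary[i][None, :]   # (prev, current)
        back[i] = np.argmin(total, axis=0)
        cost = np.min(total, axis=0)
    path = [int(np.argmin(cost))]
    for i in range(n - 1, 0, -1):
        path.append(int(back[i][path[-1]]))
    return path[::-1]

# Toy example: 5 stations along the narrow band, 4 lateral candidates each.
rng = np.random.default_rng(0)
unary = rng.random((5, 4))
pairwise = 0.2 * np.abs(np.arange(4)[:, None] - np.arange(4)[None, :])  # smoothness
print(dp_polyline(unary, pairwise))
```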

Relevance: 100.00%

Abstract:

The south of Minas Gerais, Brazil, stands out among coffee-growing regions for its capacity to produce specialty coffees. Its potential has been recognized by the Cup of Excellence (COE), which has made it one of the most award-winning Brazilian regions in recent years. Given the evident relationship between product quality and the environment, scientific studies are needed to provide a foundation for discriminating product origin and to create new methods for combating possible fraud. The aim of this study was to evaluate the use of carbon and nitrogen isotopes in discriminating the production environments of specialty coffees from the Serra da Mantiqueira of Minas Gerais by means of a discriminant model. Coffee samples were composed of ripe yellow and red fruits collected manually at altitudes below 1,000 m, from 1,000 to 1,200 m, and above 1,200 m. The yellow and red fruits were subjected to dry processing and wet processing, with five replications. A total of 119 samples were used to discriminate specialty coffee production environments by means of stable isotopes and statistical modeling. The model generated had an accuracy rate of 89% in discriminating environments and was composed of the isotope variables δ15N, δ13C, %C, %N, δD and δ18O (meteoric water), together with sensory analysis scores. In addition, for the first time, discrimination of environments on a local geographic scale, within a single municipality, was proposed and successfully carried out. This shows that isotope analysis is an effective method for verifying the geographic origin of specialty coffees.
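The abstract does not specify which discriminant method was used; a minimal sketch of one plausible setup (linear discriminant analysis on the listed isotope variables plus sensory score, with synthetic data in place of the 119 real samples) is given below.

```python
# Hedged sketch of a discriminant model like the one described: classifying
# production environment (altitude class) from isotope ratios plus sensory
# score. Feature names follow the abstract; the data are synthetic and the
# exact model used in the paper may differ.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

features = ["d15N", "d13C", "pctC", "pctN", "dD", "d18O_meteoric", "sensory_score"]
rng = np.random.default_rng(1)
X = rng.normal(size=(119, len(features)))      # placeholder for the 119 samples
y = rng.integers(0, 3, size=119)               # 3 altitude classes: <1000, 1000-1200, >1200 m

lda = LinearDiscriminantAnalysis()
acc = cross_val_score(lda, X, y, cv=5).mean()  # the paper reports ~89% on real data
print(f"cross-validated accuracy on synthetic data: {acc:.2f}")
```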

Relevance: 100.00%

Abstract:

Filaments are narrow, shallow structures of cool water originating from the coast. They are typical features of the four main eastern boundary upwelling systems (EBUS). In spite of their significant biological and chemical roles, through the offshore export of nutrient-rich waters, the physical processes that generate them are still not completely understood. This paper is a process-oriented study of filament generation mechanisms. Our goal is twofold: firstly, to obtain a numerical solution able to represent well the characteristics of the filament off Cape Ghir (30°38'N, northwestern Africa) in the Canary EBUS and, secondly, to explain its formation by a simple mechanism based on the balance of potential vorticity. The first goal is achieved by use of the ROMS model (Regional Ocean Modeling System) in embedded domains around Cape Ghir, with a horizontal resolution going up to 1.5 km for the finest domain. The latter gets its initial and boundary conditions from a parent solution and is forced by climatological, high-resolution atmospheric fields. The modeled filaments display spatial, temporal and physical characteristics in agreement with the available in situ and satellite observations. This model solution is used as a reference against which to compare the results of a set of process-oriented experiments. These experiments allow us to reach the second objective. Their respective solutions serve to highlight the contribution of various processes to filament generation. Since the study is focused on general processes present under climatological forcing conditions, inter-annual forcing is not necessary. The underlying idea for filament generation is the balance of potential vorticity in the Canary EBUS: the upwelling jet is characterized by negative relative vorticity and flows southward along a narrow band of uniform potential vorticity. In the vicinity of the cape, an injection of relative vorticity induced by the wind breaks the existing vorticity balance. The upwelling jet is prevented from continuing its way southward and has to turn offshore to follow lines of equal potential vorticity. The model results highlight the essential role of the wind, associated with the particular topography (coastline and bottom) around the cape. The mechanism presented here is general and can thus be applied to other EBUS.
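The potential-vorticity argument can be summarized, in its simplest shallow-water form (an idealization, not the full budget diagnosed from the ROMS solution), as follows.

```latex
% Idealized shallow-water form of the potential-vorticity balance invoked above
% (a textbook simplification, not the full budget computed from the ROMS runs):
\begin{equation}
  q = \frac{f + \zeta}{h}, \qquad
  \frac{Dq}{Dt} = \frac{1}{h}\,\hat{\mathbf{k}} \cdot \nabla \times
  \left( \frac{\boldsymbol{\tau}}{\rho_0 h} \right).
\end{equation}
% Unforced, the upwelling jet follows contours of constant q; near the cape the
% wind-stress curl injects relative vorticity, breaks this balance, and forces
% the jet to turn offshore along the q contours.
```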

Relevance: 100.00%

Abstract:

Direct observations, satellite measurements and paleo records reveal strong variability in the Atlantic subpolar gyre (SPG) on various time scales. Here we show that variations of comparable amplitude can only be simulated in a coupled climate model in the proximity of a dynamical threshold. The threshold and the associated dynamic response are due to a positive feedback involving increased salt transport in the subpolar gyre and enhanced deep convection in its centre. A series of sensitivity experiments is performed with a coarse-resolution ocean general circulation model coupled to a statistical-dynamical atmosphere model which in itself does not produce atmospheric variability. To simulate the impact of atmospheric variability, the model system is perturbed with freshwater forcing of varying but small amplitude and multi-decadal to centennial periodicities, and with observed variations in wind stress. While both freshwater and wind-stress forcing have a small direct effect on the strength of the subpolar gyre, the magnitude of the gyre's response is strongly increased in the vicinity of the threshold. Our results indicate that baroclinic self-amplification in the North Atlantic Ocean can play an important role in presently observed SPG variability, and thereby in North Atlantic climate variability on multi-decadal scales.

Relevance: 100.00%

Abstract:

Professor Sir David R. Cox (DRC) is widely acknowledged as among the most important scientists of the second half of the twentieth century. He inherited the mantle of statistical science from Pearson and Fisher, advanced their ideas, and translated statistical theory into practice so as to forever change the application of statistics in many fields, but especially biology and medicine. The logistic and proportional hazards models he substantially developed are arguably among the most influential biostatistical methods in current practice. This paper looks forward over the period from DRC's 80th to 90th birthdays, to speculate about the future of biostatistics, drawing lessons from DRC's contributions along the way. We consider "Cox's model" (CM) of biostatistics, an approach to statistical science that: formulates scientific questions or quantities in terms of parameters gamma in probability models f(y; gamma) that represent, in a parsimonious fashion, the underlying scientific mechanisms (Cox, 1997); partitions the parameters gamma = (theta, eta) into a subset of interest theta and other "nuisance parameters" eta necessary to complete the probability distribution (Cox and Hinkley, 1974); develops methods of inference about the scientific quantities that depend as little as possible upon the nuisance parameters (Barndorff-Nielsen and Cox, 1989); and thinks critically about the appropriate conditional distribution on which to base inferences. We briefly review exciting biomedical and public health challenges that are capable of driving statistical developments in the next decade. We discuss the statistical models and model-based inferences central to the CM approach, contrasting them with computationally intensive strategies for prediction and inference advocated by Breiman and others (e.g. Breiman, 2001) and with more traditional design-based methods of inference (Fisher, 1935). We discuss the hierarchical (multi-level) model as an example of the future challenges and opportunities for model-based inference. We then consider the role of conditional inference, a second key element of the CM. Recent examples from genetics are used to illustrate these ideas. Finally, the paper examines causal inference and statistical computing, two other topics we believe will be central to biostatistics research and practice in the coming decade. Throughout the paper, we attempt to indicate how DRC's work and the "Cox Model" have set a standard of excellence to which all can aspire in the future.
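As a concrete instance of the parameter partition described above, the proportional hazards model can be written as follows (standard textbook form, added here for illustration).

```latex
% The proportional hazards model as a concrete instance of the gamma = (theta, eta)
% partition (standard textbook statement, added for illustration):
\begin{equation}
  \lambda(t \mid x) = \lambda_0(t) \, \exp(x^{\mathsf{T}} \beta).
\end{equation}
% beta plays the role of the parameters of interest theta, while the unspecified
% baseline hazard lambda_0(t) is the nuisance parameter eta, eliminated through
% the partial likelihood so that inference about beta depends on it as little
% as possible.
```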

Relevance: 100.00%

Abstract:

We describe a method for evaluating an ensemble of predictive models given a sample of observations comprising the model predictions and the outcome event measured with error. Our formulation allows us to simultaneously estimate the measurement error parameters, the true outcome (the gold standard), and a relative weighting of the predictive scores. We describe conditions necessary to estimate the gold standard and for these estimates to be calibrated, and we detail how our approach is related to, but distinct from, standard model combination techniques. We apply our approach to data from a study to evaluate a collection of BRCA1/BRCA2 gene mutation prediction scores. In this example, genotype is measured with error by one or more genetic assays. We estimate the true genotype for each individual in the dataset, the operating characteristics of the commonly used genotyping procedures, and a relative weighting of the scores. Finally, we compare the scores against the gold-standard genotype and find that Mendelian scores are, on average, the more refined and better calibrated of those considered, and that the comparison is sensitive to measurement error in the gold standard.
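A minimal sketch of the measurement-error component only, under a standard latent-class assumption (conditionally independent assays with unknown sensitivity and specificity); this is not the authors' full formulation and omits the weighting of the prediction scores.

```python
# Hedged sketch of the measurement-error component only: a textbook latent-class
# EM that estimates a binary gold standard (e.g. mutation carrier status) and the
# sensitivity/specificity of each assay from noisy calls. It is not the authors'
# exact formulation and omits the relative weighting of the prediction scores.
import numpy as np

def latent_class_em(calls, n_iter=200):
    """calls: (n_subjects, n_assays) 0/1 matrix of assay results."""
    n, k = calls.shape
    prev, sens, spec = 0.3, np.full(k, 0.9), np.full(k, 0.9)
    for _ in range(n_iter):
        # E-step: posterior probability that each subject is a true positive
        log_pos = np.log(prev) + (calls * np.log(sens) + (1 - calls) * np.log(1 - sens)).sum(1)
        log_neg = np.log(1 - prev) + ((1 - calls) * np.log(spec) + calls * np.log(1 - spec)).sum(1)
        post = 1.0 / (1.0 + np.exp(log_neg - log_pos))
        # M-step: update prevalence and per-assay operating characteristics
        prev = post.mean()
        sens = np.clip((post[:, None] * calls).sum(0) / post.sum(), 1e-6, 1 - 1e-6)
        spec = np.clip(((1 - post)[:, None] * (1 - calls)).sum(0) / (1 - post).sum(), 1e-6, 1 - 1e-6)
    return post, sens, spec

# Toy usage with three hypothetical assays (sensitivity ~0.95, specificity ~0.97)
rng = np.random.default_rng(2)
truth = rng.random(500) < 0.2
calls = np.column_stack([(truth & (rng.random(500) < 0.95)) | (~truth & (rng.random(500) < 0.03))
                         for _ in range(3)]).astype(int)
post, sens, spec = latent_class_em(calls)
print(np.round(sens, 2), np.round(spec, 2))
```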

Relevance: 100.00%

Abstract:

A novel solution to the long-standing issue of chip entanglement and breakage in metal cutting is presented in this dissertation. Through this work, an attempt is made to achieve universal chip control in machining by using chip guidance and subsequent breakage by backward bending (tensile loading of the chip's rough top surface) to effectively control long continuous chips into small segments. One big limitation of using chip breaker geometries in disposable carbide inserts is that the application range is limited to a narrow band that depends on the cutting conditions. Even within a recommended operating range, chip breakers do not function effectively as designed because of the inherent variations of the cutting process. Moreover, for a particular process, matching the chip breaker geometry with the right cutting conditions to achieve effective chip control is a very iterative process. The existence of a large variety of proprietary chip breaker designs further exacerbates the problem of easily implementing a robust and comprehensive chip control technique. To address the need for a robust and universal chip control technique, a new method is proposed in this work. By using a single tool top form geometry coupled with a tooling system for inducing chip breaking by backward bending, the proposed method achieves comprehensive chip control over a wide range of cutting conditions. A geometry-based model is developed to predict a variable edge inclination angle that guides the chip flow to a predetermined target location. Chip kinematics for the new tool geometry is examined via photographic evidence from experimental cutting trials. Both qualitative and quantitative methods are used to characterize the chip kinematics. Results from the chip characterization studies indicate that the chip flow and final form show a remarkable consistency across multiple levels of workpiece and tool configurations as well as cutting conditions. A new tooling system is then designed to comprehensively break the chip by backward bending. Test results with the new tooling system prove that, by utilizing the chip guidance and backward bending mechanism, long continuous chips can be more consistently broken into smaller segments that are generally deemed acceptable or good chips. It is found that the proposed tool can be applied effectively over a wider range of cutting conditions than present chip breakers, thus possibly taking the first step towards achieving universal chip control in machining.

Relevance: 100.00%

Abstract:

The dissertation titled "Driver Safety in Far-side and Far-oblique Crashes" presents a novel approach to assessing vehicle cockpit safety by integrating Human Factors and Applied Mechanics. The methodology of this approach is aimed at improving safety in compact mobile workspaces such as patrol vehicle cockpits. A statistical analysis performed using the state of Michigan's traffic crash data to assess various contributing factors affecting the risk of severe driver injuries showed that the risk was greater for unrestrained drivers (OR = 3.38, p < 0.0001) and for incidents involving front and far-side crashes without seatbelts (OR = 8.0 and 23.0 respectively, p < 0.005). Statistics also showed that near-side and far-side crashes pose similar threats to driver injury severity. A Human Factors survey was conducted to assess various Human-Machine/Human-Computer Interaction (HMI/HCI) aspects in patrol vehicle cockpits. Results showed that tasks requiring manual operation, especially use of the laptop, required more attention and could potentially cause more distraction. A vehicle survey conducted to evaluate ergonomics-related issues revealed that some of the equipment was in airbag deployment zones. In addition, experiments were conducted to assess the effects on driver distraction caused by changing the position of in-car accessories. A driving simulator study was conducted to mimic HMI/HCI in a patrol vehicle cockpit (20 subjects, average driving experience = 5.35 years, s.d. = 1.8). It was found that the mounting locations of manual tasks did not result in a significant change in response times. Visual displays resulted in response times of less than 1.5 s. It can also be concluded that the manual task was equally distracting regardless of mounting position (average response time was 15 s). Average speeds and lane deviations did not show any significant effects. Data from 13 full-scale sled tests conducted to simulate far-side impacts at 70 PDOF and 40 PDOF were used to analyze head injuries and HIC/AIS values. It was found that the accelerations generated by the vehicle deceleration alone were high enough to cause AIS 3 - AIS 6 injuries. Pretensioners could mitigate injuries only in 40 PDOF (oblique) impacts and were ineffective in 70 PDOF impacts. Seat belts were ineffective in protecting the driver's head from injuries. The head would come into contact with the laptop during a far-oblique (40 PDOF) crash and with the far-side door during an angle-type (70 PDOF) crash. Finite element analysis of the head-laptop impact interaction showed that the contact velocity was the most crucial factor in causing a severe (and potentially fatal) head injury. Results indicate that no equipment should be mounted in driver trajectory envelopes. A very narrow band of space is left in patrol vehicles where manual-task equipment can be installed both safely and ergonomically. In case of contact, the material stiffness and damping properties play a very significant role in determining the injury outcome. Future work may be done on improving the interior materials' properties to better absorb and dissipate the kinetic energy of the head. The design of seat belts and pretensioners may also be seen as an essential aspect to be further improved.
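For reference, the odds-ratio statistics quoted above are of the form computed below from a 2x2 table; the counts here are invented (chosen only so the point estimate lands near OR = 3.38) and do not reproduce the dissertation's data.

```python
# Illustration of the odds-ratio statistics quoted above (OR with a 95% CI),
# computed from a 2x2 table. The counts are hypothetical and do not reproduce
# the dissertation's data.
import numpy as np
from scipy.stats import norm

a, b = 120, 400     # unrestrained drivers: severe injury / not severe  (hypothetical)
c, d = 90, 1015     # restrained drivers:   severe injury / not severe  (hypothetical)

or_hat = (a * d) / (b * c)
se_log = np.sqrt(1 / a + 1 / b + 1 / c + 1 / d)           # Woolf standard error of log(OR)
ci = np.exp(np.log(or_hat) + np.array([-1, 1]) * 1.96 * se_log)
p = 2 * norm.sf(abs(np.log(or_hat) / se_log))              # Wald test against OR = 1
print(f"OR = {or_hat:.2f}, 95% CI [{ci[0]:.2f}, {ci[1]:.2f}], p = {p:.2g}")
```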

Relevance: 100.00%

Abstract:

We analyzed 580 integrated scrape-samples from HPC Site 480 for organic and carbonate carbon. Once precise dating is available, these will provide a high-resolution framework for understanding late Quaternary oceanographic and climatic fluctuations in this region. Organic carbon ranges mostly within a narrow band of 1.8 to 3.5% C. Calcium carbonate varies from undetectable to over 20%, with an average of only about 5%. Sources of carbonate are mostly benthic and planktonic foraminifers, although some sections are dominated by diagenetic carbonate, shelly hash, or nannofossils. Detrital sources are low in carbonate. We divided the sequence into 17 calcium carbonate (CC) zones to separate pulses from low and median values. The CC zones show various second-order patterns of cyclicity, asymmetry, and events. Laminated zones have the lowest, most uniform values, but a perfect correlation between carbonate content and homogeneous or laminated facies was not found. Maximum values tend to be located near the transition between these two sediment types, showing that accumulation of carbonate is favored during times of breakdown of stable oceanographic conditions.

Relevance: 100.00%

Abstract:

We present the data used to construct the Cenozoic and Cretaceous portion of the Phanerozoic curve of seawater 87Sr/86Sr that had been given in summary form by W.H. Burke and co-workers. All Cenozoic samples (128) and 22 Cretaceous samples are foram-nannofossil oozes and limestones from DSDP cores distributed among 13 sites in the Atlantic, Pacific and Indian Oceans and the Caribbean Sea. Non-DSDP Cretaceous samples (126) include limestone, anhydrite and phosphate samples from North America, Europe and Asia. Determination of the 87Sr/86Sr value of seawater at particular times in the past is based on comparison of ratios derived from coeval marine samples from widely separated geographic areas. These samples are characterized by a wide variety of diagenetic and burial histories. The large size and cosmopolitan nature of the data set decrease the likelihood that, among coeval data, systematic error has been introduced by a similar pattern of diagenetic alteration of the ratios. There is good clustering of data points throughout the Cenozoic and Cretaceous curve. The consistency of the data is illustrated by Cenozoic and Cretaceous data plots that include a separate symbol for each DSDP site and non-DSDP sample location. More than 98% of the data points are enclosed by upper and lower lines that define a narrow band. For any given time, the correct seawater ratio probably lies within this band. A line drawn within the band represents our estimate of the actual seawater ratio as a function of time. The general configuration of the Cenozoic and Cretaceous curve appears to be strongly influenced by the history of plate interactions and sea-floor spreading. Specific rises and falls in the 87Sr/86Sr of seawater, however, may be caused by a variety of factors such as variation in the lithologic composition of the crust exposed to weathering, the configuration and topographic relief of continents, volcanic activity, the rate of sea-floor spreading, the extent of continental inundation by epeiric seas, and variations in both climate and paleoceanographic conditions. Many or all of these factors are probably related to global tectonic processes, yet their combined effect on the temporal variation of seawater 87Sr/86Sr can complicate a direct plate-tectonic interpretation for portions of the seawater curve.

Relevance: 100.00%

Abstract:

An increasing number of neuroimaging studies are concerned with the identification of interactions or statistical dependencies between brain areas. Dependencies between the activities of different brain regions can be quantified with functional connectivity measures such as the cross-correlation coefficient. An important factor limiting the accuracy of such measures is the amount of empirical data available. For event-related protocols, the amount of data also affects the temporal resolution of the analysis. We use analytical expressions to calculate the amount of empirical data needed to establish whether a certain level of dependency is significant when the time series are autocorrelated, as is the case for biological signals. These analytical results are then contrasted with estimates from simulations based on real data recorded with magnetoencephalography during a resting-state paradigm and during the presentation of visual stimuli. Results indicate that, for broadband signals, 50–100 s of data are required to detect a true underlying cross-correlation coefficient of 0.05. This corresponds to a resolution of a few hundred milliseconds for typical event-related recordings. The required time window increases for narrow-band signals as frequency decreases. For instance, approximately 3 times as much data are necessary for signals in the alpha band. Important implications can be derived for the design and interpretation of experiments to characterize weak interactions, which are potentially important for brain processing.
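A back-of-the-envelope version of this sample-size argument can be obtained from the Fisher z-transform with an effective-sample-size correction for autocorrelation; the decorrelation time below is an assumed value, and the paper derives exact analytical expressions.

```python
# Back-of-the-envelope version of the sample-size argument above, using the
# Fisher z-transform with an effective-sample-size correction for autocorrelation.
# The decorrelation time is an assumed value; the paper derives exact expressions.
import numpy as np
from scipy.stats import norm

def seconds_needed(r, tau_decorr, alpha=0.05):
    """Recording length needed for a correlation r to reach significance when
    samples separated by ~tau_decorr seconds are approximately independent."""
    z_crit = norm.ppf(1 - alpha / 2)
    n_indep = (z_crit / np.arctanh(r)) ** 2 + 3    # independent samples needed
    return n_indep * tau_decorr                    # seconds of data

# Broadband signal, assumed ~40 ms decorrelation time: ~60 s, in line with the
# 50-100 s figure; slower (e.g. alpha-band) signals decorrelate more slowly,
# so correspondingly more data is required.
print(f"{seconds_needed(r=0.05, tau_decorr=0.04):.0f} s")
```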

Relevance: 100.00%

Abstract:

It is clear that in the near future much broader transmissions in the HF band will replace part of the current narrow-band links. Our personal view is that a truly wideband signal is infeasible in this environment because usage is typically very intensive and may suffer interference from all over the world. Therefore, we envision that dynamic multiband transmissions may provide more satisfactory performance. From the very beginning, we observed that real links with our broadband transceiver suffered from interference outside our multiband allocation but within the acquisition bandwidth, which degraded the expected performance. We therefore concluded that a mitigation structure is required that operates on severely saturated signals, as the interference may be of much higher power. In this paper we present a procedure based on Higher Order Crossings (HOC) statistics that is able to extract most of the signal structure even when the amplitude is severely distorted, and that allows estimation of the interference carrier frequency to command a variable notch filter mitigating its effect in the analog domain.
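The key property exploited is that hard saturation preserves zero crossings; a minimal sketch of the first-order crossing count (the simplest member of the HOC sequence) applied to a clipped interferer is given below, with the sample rate and frequencies chosen arbitrarily.

```python
# Minimal sketch of the idea behind HOC-based estimation: hard saturation destroys
# amplitude information but preserves zero crossings, so the crossing rate still
# reveals the dominant interference frequency. Only the first-order crossing count
# is shown (the full HOC method iterates over difference filters), and the notch
# filter control loop is not modeled. Sample rate and frequencies are arbitrary.
import numpy as np

fs = 48_000.0                                               # sample rate (assumed)
t = np.arange(0, 0.2, 1 / fs)
interference = np.sin(2 * np.pi * 3_100.0 * t)              # strong narrow-band interferer
signal = 0.05 * np.random.randn(t.size)                     # weak wideband signal of interest
saturated = np.clip(10.0 * (interference + signal), -1.0, 1.0)   # severe front-end clipping

# Zero-crossing count -> frequency: a sinusoid at f crosses zero 2*f times per second.
crossings = np.count_nonzero(np.diff(np.signbit(saturated).astype(np.int8)))
f_est = crossings * fs / (2.0 * saturated.size)
print(f"estimated interference carrier ~ {f_est:.0f} Hz")   # ~3.1 kHz, drives the notch filter
```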