929 results for Maximum Power Point Tracking (MPPT)
Abstract:
HIV virulence, i.e. the time of progression to AIDS, varies greatly among patients. As for other rapidly evolving pathogens of humans, it is difficult to know if this variance is controlled by the genotype of the host or that of the virus because the transmission chain is usually unknown. We apply the phylogenetic comparative approach (PCA) to estimate the heritability of a trait from one infection to the next, which indicates the control of the virus genotype over this trait. The idea is to use viral RNA sequences obtained from patients infected by HIV-1 subtype B to build a phylogeny, which approximately reflects the transmission chain. Heritability is measured statistically as the propensity for patients close in the phylogeny to exhibit similar infection trait values. The approach reveals that up to half of the variance in set-point viral load, a trait associated with virulence, can be heritable. Our estimate is significant and robust to noise in the phylogeny. We also check for the consistency of our approach by showing that a trait related to drug resistance is almost entirely heritable. Finally, we show the importance of taking into account the transmission chain when estimating correlations between infection traits. The fact that HIV virulence is, at least partially, heritable from one infection to the next has clinical and epidemiological implications. The difference between earlier studies and ours comes from the quality of our dataset and from the power of the PCA, which can be applied to large datasets and accounts for within-host evolution. The PCA opens new perspectives for approaches linking clinical data and evolutionary biology because it can be extended to study other traits or other infectious diseases.
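One common way to quantify this kind of heritability is to measure the phylogenetic signal of the trait under a Brownian-motion model, for example via Pagel's lambda. Below is a minimal Python sketch of that idea; the tree covariance matrix, trait values and the lambda parameterization are illustrative assumptions, not the estimator used in the study.

```python
# Minimal sketch: estimate phylogenetic signal (Pagel's lambda) for a trait
# such as set-point viral load, given a patient phylogeny. Illustrative only;
# the covariance matrix C and the trait values are made-up assumptions.
import numpy as np
from scipy.optimize import minimize_scalar

# C[i, j] = shared branch length between tips i and j under Brownian motion
C = np.array([[1.0, 0.6, 0.1, 0.1],
              [0.6, 1.0, 0.1, 0.1],
              [0.1, 0.1, 1.0, 0.7],
              [0.1, 0.1, 0.7, 1.0]])
y = np.array([4.2, 4.5, 3.1, 3.3])   # hypothetical log10 set-point viral loads

def neg_log_lik(lam):
    """Profile likelihood of lambda: off-diagonal covariance scaled by lambda."""
    V = lam * C
    np.fill_diagonal(V, np.diag(C))
    Vinv = np.linalg.inv(V)
    one = np.ones_like(y)
    mu = one @ Vinv @ y / (one @ Vinv @ one)   # GLS estimate of the mean
    r = y - mu
    n = len(y)
    sigma2 = r @ Vinv @ r / n                  # ML estimate of the Brownian rate
    _, logdet = np.linalg.slogdet(V)
    return 0.5 * (n * np.log(2 * np.pi * sigma2) + logdet + n)

res = minimize_scalar(neg_log_lik, bounds=(0.0, 1.0), method="bounded")
print(f"estimated lambda (phylogenetic heritability): {res.x:.2f}")
```

A lambda close to 1 means that patients close in the phylogeny carry similar trait values, consistent with control by the virus genotype; a lambda close to 0 means the phylogeny carries no information about the trait.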
Abstract:
Discrete data arise in various research fields, typically when the observations are count data. I propose a robust and efficient parametric procedure for the estimation of discrete distributions. The estimation is done in two phases. First, a very robust, but possibly inefficient, estimate of the model parameters is computed and used to identify outliers. Then the outliers are either removed from the sample or given low weights, and a weighted maximum likelihood estimate (WML) is computed. The weights are determined via an adaptive process such that, if the data follow the model, asymptotically no observation is downweighted. I prove that the final estimator inherits the breakdown point of the initial one, and that its influence function at the model is the same as that of the maximum likelihood estimator, which strongly suggests that it is asymptotically fully efficient. The initial estimator is a minimum disparity estimator (MDE). MDEs can be shown to have full asymptotic efficiency, and some MDEs have very high breakdown points and very low bias under contamination. Several initial estimators are considered, and the performance of the WMLs based on each of them is studied. In a wide variety of situations the WML substantially improves on the initial estimator, both in terms of finite-sample mean square error and in terms of bias under contamination. Moreover, the performance of the WML remains rather stable under a change of the MDE, even when the MDEs have very different behaviors. Two examples of application of the WML to real data are considered. In both of them, the need for a robust estimator is clear: the maximum likelihood estimator is badly corrupted by the presence of a few outliers. The procedure is particularly natural in the discrete-distribution setting, but could be extended to the continuous case, for which a possible procedure is sketched.
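As a concrete illustration of the two-phase scheme, here is a minimal Python sketch for a Poisson model: a minimum Hellinger distance fit as the robust initial estimator, a simple adaptive downweighting rule, and a weighted maximum likelihood step. The weighting rule shown (capping weights by the ratio of model to empirical cell probabilities) is a simplified stand-in for the paper's adaptive process, and the data are made up.

```python
# Two-phase robust estimation of a Poisson parameter: (1) minimum Hellinger
# distance estimate, (2) adaptive weights, (3) weighted maximum likelihood.
# The weighting rule is a simplified stand-in for the paper's adaptive process.
import numpy as np
from collections import Counter
from scipy.optimize import minimize_scalar
from scipy.stats import poisson

x = np.array([0, 1, 1, 2, 2, 2, 3, 3, 4, 15, 17])   # toy data with two outliers
n = len(x)
counts = Counter(x)
support = np.arange(0, x.max() + 1)
d = np.array([counts.get(k, 0) / n for k in support])   # empirical pmf

def hellinger(lam):
    f = poisson.pmf(support, lam)
    return np.sum((np.sqrt(d) - np.sqrt(f)) ** 2)

lam0 = minimize_scalar(hellinger, bounds=(0.01, x.max()), method="bounded").x

# Adaptive weights: downweight observations in cells where the empirical
# frequency greatly exceeds the initial model fit (a simple illustrative rule).
f0 = poisson.pmf(x, lam0)
emp = np.array([counts[k] / n for k in x])
w = np.minimum(1.0, f0 / emp)

lam_wml = np.sum(w * x) / np.sum(w)   # weighted ML estimate of the Poisson mean
print(f"initial (Hellinger) estimate:     {lam0:.2f}")
print(f"weighted ML estimate:             {lam_wml:.2f}")
print(f"plain ML (corrupted by outliers): {x.mean():.2f}")
```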
Abstract:
We study the effect of strong heterogeneities on the fracture of disordered materials using a fiber bundle model. The bundle is composed of two subsets of fibers: a fraction 0 ≤ α ≤ 1 of the fibers is unbreakable, while the remaining fraction 1 - α is characterized by a distribution of breaking thresholds. Assuming global load sharing, we show analytically that there exists a critical fraction of components αc which separates two qualitatively different regimes of the system: below αc the burst size distribution is a power law with the usual exponent τ = 5/2, while above αc the exponent switches to a lower value τ = 9/4 and a cutoff function appears with a diverging characteristic size. Analyzing the macroscopic response of the system, we demonstrate that the transition occurs only for disorder distributions whose constitutive curve has a single maximum and an inflection point, defining a novel universality class of breakdown phenomena.
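A minimal simulation sketch of such a bundle under global load sharing is shown below; uniform breaking thresholds on [0, 1] and the chosen parameters are illustrative assumptions, and the analytic exponents quoted above are not derived here.

```python
# Fiber bundle with a fraction alpha of unbreakable fibers under global load
# sharing: quasi-static loading, recording burst (avalanche) sizes.
# Uniform thresholds on [0, 1] are an illustrative choice of disorder.
import numpy as np

def burst_sizes(N=100_000, alpha=0.3, rng=np.random.default_rng(0)):
    n_break = int((1 - alpha) * N)                 # breakable fibers only
    thresholds = np.sort(rng.random(n_break))
    # External load needed to break the i-th weakest fiber while
    # N - i fibers (breakable and unbreakable) are still intact.
    load = thresholds * (N - np.arange(n_break))
    bursts = []
    i = 0
    while i < n_break:
        # A burst starts at i and continues while the subsequent fibers break
        # without requiring a load higher than the one that triggered it.
        j = i + 1
        while j < n_break and load[j] <= load[i]:
            j += 1
        bursts.append(j - i)
        i = j
    return np.array(bursts)

sizes = burst_sizes()
vals, freq = np.unique(sizes, return_counts=True)
for v, f in zip(vals[:8], freq[:8]):
    print(f"burst size {v}: {f} events")   # tail of the histogram ~ power law
```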
Abstract:
We previously reported that nuclear grade assignment of prostate carcinomas is subject to a cognitive bias induced by the tumor architecture. Here, we asked whether this bias is mediated by the non-conscious selection of nuclei that "match the expectation" induced by the inadvertent glance at the tumor architecture. Twenty pathologists were asked to grade nuclei in high power fields of 20 prostate carcinomas displayed on a computer screen. Unknown to the pathologists, each carcinoma was shown twice, once against the background of a low grade, tubule-rich carcinoma and once against the background of a high grade, solid carcinoma. Eye tracking made it possible to identify which nuclei the pathologists fixated during the 8-second projection period. For all 20 pathologists, nuclear grade assignment was significantly biased by tumor architecture. Pathologists tended to fixate on bigger, darker, and more irregular nuclei when those were projected against high grade, solid carcinomas than against low grade, tubule-rich carcinomas (and vice versa). However, the morphometric differences of the selected nuclei accounted for only 11% of the architecture-induced bias, suggesting that the bias can be explained only in small part by the unconscious fixation on nuclei that "match the expectation". In conclusion, the selection of "matching nuclei" represents an unconscious effort to justify the gravitation of nuclear grades towards the tumor architecture.
Abstract:
The analysis of multiantenna capacity in the high-SNR regime has hitherto focused on the high-SNR slope (or maximum multiplexing gain), which quantifies the multiplicative increase as a function of the number of antennas. This traditional characterization is unable to assess the impact of prominent channel features since, for a majority of channels, the slope equals the minimum of the number of transmit and receive antennas. Furthermore, a characterization based solely on the slope captures only the scaling but has no notion of the power required for a certain capacity. This paper advocates a more refined characterization whereby, as a function of the SNR in dB, the high-SNR capacity is expanded as an affine function in which the impact of channel features such as antenna correlation, unfaded components, etc., resides in the zero-order term or power offset. The power offset, for which we find insightful closed-form expressions, is shown to play a chief role for SNR levels of practical interest.
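For reference, the affine high-SNR expansion referred to has the standard form below, where S∞ denotes the high-SNR slope and L∞ the power offset expressed in 3-dB units (a sketch of the usual notation, which may differ in detail from the paper's):

```latex
C(\mathrm{SNR}) = S_{\infty}\left(\frac{\mathrm{SNR}|_{\mathrm{dB}}}{3\,\mathrm{dB}} - L_{\infty}\right) + o(1),
\qquad \mathrm{SNR} \to \infty .
```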
Abstract:
This paper formulates power allocation policies that maximize the region of mutual informations achievable in multiuser downlink OFDM channels. Arbitrary partitioning of the available tones among users and arbitrary modulation formats, possibly different for every user, are considered. Two distinct policies are derived, respectively for slow fading channels tracked instantaneously by the transmitter and for fast fading channels known to it only statistically. With instantaneous channel tracking, the solution adopts the form of a multiuser mercury/waterfilling procedure that generalizes the single-user mercury/waterfilling introduced in [1, 2]. With only statistical channel information, in contrast, the mercury/waterfilling interpretation is lost. For both policies, a number of limiting regimes are explored and illustrative examples are provided.
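With Gaussian inputs, mercury/waterfilling reduces to classic waterfilling; a minimal Python sketch of that special case is shown below. The per-tone gains and the power budget are made-up values, and this is not the full mercury/waterfilling procedure for arbitrary modulation formats.

```python
# Classic waterfilling over OFDM tones (the Gaussian-input special case to
# which mercury/waterfilling reduces). Gains and power budget are illustrative.
import numpy as np

def waterfilling(gains, total_power, tol=1e-9):
    """Allocate power p_k = max(0, mu - 1/g_k) so that sum(p_k) = total_power."""
    lo, hi = 0.0, total_power + 1.0 / gains.min()   # bracket the water level mu
    while hi - lo > tol:
        mu = 0.5 * (lo + hi)
        p = np.maximum(0.0, mu - 1.0 / gains)
        if p.sum() > total_power:
            hi = mu
        else:
            lo = mu
    return np.maximum(0.0, 0.5 * (lo + hi) - 1.0 / gains)

gains = np.array([2.0, 1.0, 0.5, 0.1])    # per-tone channel gains (SNR per unit power)
p = waterfilling(gains, total_power=4.0)
rates = np.log2(1.0 + gains * p)          # achievable mutual information per tone
print("power allocation:", np.round(p, 3))
print("rates (bits/tone):", np.round(rates, 3))
```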
Abstract:
We present a novel approach to N-person bargaining, based on the idea that the agreement reached in a negotiation is determined by how the direct conflict resulting from disagreement would be resolved. Our basic building block is the disagreement function, which maps each set of feasible outcomes into a disagreement point. Using this function and a weak axiom based on individual rationality we reach a unique solution: the agreement in the shadow of conflict, ASC. This agreement may be construed as the limit of a sequence of partial agreements, each of which is reached as a function of the parties' relative power. We examine the connection between ASC and asymmetric Nash solutions. We show the connection between the power of the parties embodied in the ASC solution and the bias in the social welfare function (SWF) that would select ASC as an asymmetric Nash solution.
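For comparison, the asymmetric Nash solution that ASC is related to can be illustrated on a simple divide-the-dollar problem, where the solution share equals the bargaining weight. The weight and feasible set below are made-up values, and the sketch does not implement the ASC construction itself.

```python
# Asymmetric Nash bargaining over a unit surplus with disagreement point (0, 0):
# maximize x**a * (1 - x)**(1 - a). The weight a is an illustrative value.
import math
from scipy.optimize import minimize_scalar

a = 0.7  # hypothetical bargaining weight of player 1

def neg_log_nash_product(x):
    return -(a * math.log(x) + (1 - a) * math.log(1 - x))

res = minimize_scalar(neg_log_nash_product, bounds=(1e-9, 1 - 1e-9), method="bounded")
print(f"player 1 share: {res.x:.3f} (closed form: a = {a})")  # share equals the weight
```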
Abstract:
This academic year, in the Marketing and Business Management course, and specifically in Entrepreneurship, the discipline of Simulation - Marketing Games was devoted to the creation of a company in the computer business in the online business simulator Marketplace, in order to put into practice the theoretical knowledge acquired during the previous semesters. On this platform we were confronted with decisions over eight quarters, four per year, in order to encourage learning in a practical way within a virtual and dynamic environment. Every quarter we faced well-organized tasks, taking as a reference point defined strategies such as analysing market research, branding, managing the stores after their creation, developing the policy of the 4Ps, identifying opportunities, monitoring finances and deciding where to invest. In every quarter decisions were submitted and results were then returned, such as market performance, financial performance, investment in the future and the "health" of the company's marketing efficiency, which were then analysed by our company, by the teaching staff and against the competition through the Balanced Scorecard, both semi-annual and cumulative. To start operations, the company was allocated a total of 2,000,000 in the first year, corresponding to 500,000 in each of the first four quarters, plus 5,000,000 in the fifth quarter, for an overall total of 7,000,000. The invested capital was used to buy market research, open sales offices, create brands, hire a sales force, advertise the products created and carry out R&D activities, in order to make a profit and become self-sufficient enough to guarantee repayment of the invested capital to Corporate Headquarters.
Abstract:
Three-dimensional imaging for the quantification of myocardial motion is a key step in the evaluation of cardiac disease. A tagged magnetic resonance imaging method that automatically tracks myocardial displacement in three dimensions is presented. Unlike other techniques, this method tracks both in-plane and through-plane motion from a single image plane without affecting the duration of image acquisition. A small z-encoding gradient is added to the refocusing lobe of the slice-selection gradient pulse in a slice-following CSPAMM acquisition. A z-encoding gradient of opposite polarity is added for the orthogonal tag direction. The additional z-gradients encode the instantaneous through-plane position of the slice. The vertical and horizontal tags are used to resolve in-plane motion, while the added z-gradients are used to resolve through-plane motion. Postprocessing automatically decodes the acquired data and tracks the three-dimensional displacement of every material point within the image plane for each cine frame. Experiments include both phantom and in vivo human validation. These studies demonstrate that the simultaneous extraction of both in-plane and through-plane displacements and pathlines from tagged images is achievable. This capability should open new avenues for the automatic quantification of cardiac motion and strain for scientific and clinical purposes.
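The in-plane tracking step that this family of methods builds on can be sketched as follows: isolate one spectral peak of the tagged signal, take its harmonic phase, and convert the phase change between frames into displacement along the tag direction. The tag spacing, filter width and synthetic signals below are illustrative assumptions, and the through-plane z-encoding/decoding step is not shown.

```python
# Sketch of HARP-style harmonic phase extraction from a 1-D tag profile:
# bandpass around the tag frequency, take the phase, convert the phase shift to
# displacement. Tag spacing and the synthetic signals are illustrative.
import numpy as np

n, tag_period = 256, 16                      # samples, tag spacing in pixels
k_tag = 2 * np.pi / tag_period
x = np.arange(n)
shift = 2.5                                  # true in-plane displacement (pixels)
frame0 = 1.0 + np.cos(k_tag * x)             # tagged intensity, reference frame
frame1 = 1.0 + np.cos(k_tag * (x - shift))   # same tissue, displaced

def harmonic_phase(signal):
    """Isolate the positive tag-frequency peak and return its phase."""
    spec = np.fft.fft(signal)
    freqs = np.fft.fftfreq(len(signal))
    mask = np.abs(freqs - 1.0 / tag_period) < 0.5 / tag_period
    return np.angle(np.fft.ifft(spec * mask))

# Phase decrease between frames corresponds to motion along the tag direction.
dphi = np.angle(np.exp(1j * (harmonic_phase(frame0) - harmonic_phase(frame1))))
displacement = dphi / k_tag                  # valid for sub-tag-spacing motion
print(f"recovered displacement at centre: {displacement[n // 2]:.2f} px (true {shift})")
```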
Abstract:
Monitoring thunderstorm activity is an essential part of operational weather surveillance given the potential hazards, including lightning, hail, heavy rainfall, strong winds or even tornadoes. This study has two main objectives: first, to describe a methodology, based on radar and total lightning data, to characterise thunderstorms in real time; second, to apply this methodology to 66 thunderstorms that affected Catalonia (NE Spain) in the summer of 2006. An object-oriented tracking procedure is employed, in which different observation data types generate four different types of objects (radar 1-km CAPPI reflectivity composites, radar reflectivity volumetric data, cloud-to-ground (CG) lightning data and intra-cloud (IC) lightning data). In the framework proposed, these objects are the building blocks of a higher-level object, the thunderstorm. The methodology is demonstrated with a dataset of thunderstorms whose main characteristics, along the complete life cycle of the convective structures (development, maturity and dissipation), are described statistically. The development and dissipation stages present similar durations in most cases examined. In contrast, the duration of the maturity phase is much more variable and related to the thunderstorm intensity, defined here in terms of lightning flash rate. Most of the IC and CG flash activity is registered in the maturity stage. In the development stage few CG flashes are observed (2% to 5%), while in the dissipation phase slightly more CG flashes are observed (10% to 15%). Additionally, a selection of thunderstorms is used to examine general life cycle patterns, obtained from the analysis of thunderstorm parameters normalized with respect to the thunderstorm total duration and the maximum value of the variables considered. Among other findings, the study indicates that the normalized duration of the three stages of the thunderstorm life cycle is similar in most thunderstorms, with the longest duration corresponding to the maturity stage (approximately 80% of the total time).
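The object hierarchy described above can be sketched as a small data model; the class and field names below are illustrative assumptions rather than the study's actual schema.

```python
# Sketch of the object-oriented composition described above: four observation
# object types are the building blocks of a higher-level Thunderstorm object.
# Class and field names are illustrative, not the study's actual data model.
from dataclasses import dataclass, field
from typing import List

@dataclass
class CappiObject:      # from radar 1-km CAPPI reflectivity composites
    max_reflectivity_dbz: float
    area_km2: float

@dataclass
class VolumeObject:     # from radar reflectivity volumetric data
    echo_top_km: float

@dataclass
class CgFlashObject:    # cloud-to-ground lightning data
    count: int

@dataclass
class IcFlashObject:    # intra-cloud lightning data
    count: int

@dataclass
class Thunderstorm:
    """Higher-level object composed of the four observation object types."""
    cappi: List[CappiObject] = field(default_factory=list)
    volume: List[VolumeObject] = field(default_factory=list)
    cg: List[CgFlashObject] = field(default_factory=list)
    ic: List[IcFlashObject] = field(default_factory=list)

    def flash_rate(self, minutes: float) -> float:
        """Total lightning flash rate, used here as the intensity measure."""
        total = sum(o.count for o in self.cg) + sum(o.count for o in self.ic)
        return total / minutes

storm = Thunderstorm(cg=[CgFlashObject(3)], ic=[IcFlashObject(42)])
print(f"flash rate: {storm.flash_rate(minutes=10):.1f} flashes/min")
```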
Abstract:
The removal of thick layers of soil under native scrubland (Cerrado) on the right bank of the Paraná River in Selvíria (State of Mato Grosso do Sul, Brazil) for construction of the Ilha Solteira Hydroelectric Power Plant caused environmental damage, affecting the revegetation process of the stripped soil. Over the years, various kinds of land use and management systems have been tried, and the aim of this study was to assess the effects of these attempts to restore the structural quality of the soil. The experiment comprised five treatments with thirty replications. The following treatments were applied: stripped soil without anthropic intervention and total absence of plant cover; stripped soil treated with sewage sludge and planted with eucalyptus and grass one year earlier; stripped soil developing natural secondary vegetation (capoeira) since 1969; pastureland since 1978, replacing the native vegetation; and soil under native vegetation (Cerrado). In the 0.00-0.20 m layer, the soil was chemically characterized for each experimental treatment. A 30-point sampling grid was used to assess soil porosity and bulk density, and to assess aggregate stability in terms of mean weight diameter (MWD) and geometric mean diameter (GMD). Aggregate stability was also determined using simulated rainfall. The results show that using sewage sludge incorporated with a rotary hoe improved the chemical fertility of the soil and produced a more uniform soil pore size distribution. Leaving the land to develop secondary vegetation or turning it over to pastureland produced an intermediate level of structural soil quality, and these two treatments produced similar results. Stripped soil without anthropic intervention was of the lowest quality, with the lowest values of cation exchange capacity (CEC) and macroporosity, as well as the highest values of soil bulk density and percentage of aggregates with diameter <0.50 mm, corroborated by its lower organic matter content. However, the percentage of larger aggregates was highest in the native vegetation treatment, which boosted MWD and GMD values. Therefore, the assessment of these land use and management systems shows that, even decades after their implementation to mitigate the degenerative effects of the installation of the Hydroelectric Plant, more efficient approaches are still required to recover the structural quality of the soil.
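For reference, the two aggregate-stability indices mentioned (MWD and GMD) are computed from wet-sieving fractions as in the short sketch below; the fraction diameters and mass proportions are made-up example values.

```python
# Mean weight diameter (MWD) and geometric mean diameter (GMD) of soil
# aggregates from wet-sieving data. The mean diameters (mm) and mass fractions
# are illustrative values, not the study's measurements.
import numpy as np

mean_diameter_mm = np.array([3.00, 1.50, 0.75, 0.375, 0.125])  # per sieve class
mass_fraction = np.array([0.35, 0.25, 0.20, 0.10, 0.10])       # sums to 1

mwd = np.sum(mean_diameter_mm * mass_fraction)
gmd = np.exp(np.sum(mass_fraction * np.log(mean_diameter_mm)) / np.sum(mass_fraction))

print(f"MWD = {mwd:.2f} mm")
print(f"GMD = {gmd:.2f} mm")
```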
Abstract:
Three-dimensional imaging and quantification of myocardial function are essential steps in the evaluation of cardiac disease. We propose a tagged magnetic resonance imaging methodology called zHARP that encodes and automatically tracks myocardial displacement in three dimensions. Unlike other motion encoding techniques, zHARP encodes both in-plane and through-plane motion in a single image plane without affecting the acquisition speed. Postprocessing unravels this encoding in order to directly track the 3-D displacement of every point within the image plane throughout an entire image sequence. Experimental results include a phantom validation experiment, which compares zHARP to phase contrast imaging, and an in vivo study of a normal human volunteer. Results demonstrate that the simultaneous extraction of in-plane and through-plane displacements from tagged images is feasible.
Abstract:
Climate refers to the long-term course or condition of weather, usually over a time scale of decades and longer. It has been documented that our global climate is changing (IPCC 2007, Copenhagen Diagnosis 2009), and Iowa is no exception. In Iowa, statistically significant changes in our precipitation, streamflow, nighttime minimum temperatures, winter average temperatures, and dewpoint humidity readings have occurred during the past few decades. Iowans are already living with warmer winters, longer growing seasons, warmer nights, higher dew-point temperatures, increased humidity, greater annual streamflows, and more frequent severe precipitation events (Fig. 1-1) than were prevalent during the past 50 years. Some of the impacts of these changes could be construed as positive, and some are negative, particularly the tendency for greater precipitation events and flooding. In the near-term, we may expect these trends to continue as long as climate change is prolonged and exacerbated by increasing greenhouse gas emissions globally from the use of fossil fuels and fertilizers, the clearing of land, and agricultural and industrial emissions. This report documents the impacts of changing climate on Iowa during the past 50 years. It seeks to answer the question, “What are the impacts of climate change in Iowa that have been observed already?” And, “What are the effects on public health, our flora and fauna, agriculture, and the general economy of Iowa?”
Abstract:
A new paint testing device was built to determine the resistance of paints to darkening caused by road grime being tracked onto them. The device consists of a tire rotating on a sample drum. Soil was applied to the tire and then tracked onto paint samples attached to the drum. A colorimeter was used to measure the lightness of the paints after tracking. Lightness is measured from 0 (absolute black) to 100 (absolute white). Four experiments were run to determine the optimum tracking time per sample, the reproducibility, the effects of different soils, and the maximum acceptable level of darkening of a paint. The following conclusions were reached: 1) the optimum tracking time was 10 minutes; 2) the reproducibility had a standard deviation of 1.5 lightness units; 3) different soils did not have a large effect on the amount of darkening of the paints; 4) a maximum acceptable darkness could not be established with the limited amount of data available; and 5) a correlation exists between the paints that darkened in the field and the paints that turned darkest on the tracking wheel.
Abstract:
When decommissioning a nuclear facility, it is important to be able to estimate the activity levels of potentially radioactive samples and compare them with the clearance values defined by regulatory authorities. This paper presents a method of calibrating a clearance box monitor based on practical experimental measurements and Monte Carlo simulations. Adjusting the simulation to experimental data obtained using a simple point source permits the computation of absolute calibration factors for more complex geometries with an accuracy of slightly more than 20%. The uncertainty of the calibration factor can be reduced to about 10% when the simulation is used relatively, in direct comparison with a measurement performed in the same geometry but with another nuclide. The simulation can also be used to validate the experimental calibration procedure when the sample is assumed to be homogeneous but the calibration factor is derived from a plate phantom. For more realistic geometries, such as a small gravel dumpster, Monte Carlo simulation shows that the calibration factor obtained with a larger homogeneous phantom is correct to within about 20%, provided sample density is taken into account as the influencing parameter. Finally, simulation can be used to estimate the effect of a contamination hotspot: the research supporting this paper shows that activity could be largely underestimated in the event of a centrally located hotspot and overestimated for a peripherally located hotspot if the sample is assumed to be homogeneously contaminated. This demonstrates the usefulness of being able to complement experimental methods with Monte Carlo simulations in order to estimate calibration factors that cannot be directly measured because of a lack of available material or specific geometries.
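The relative use of the simulation described above can be sketched numerically: one measurement of a reference nuclide in the sample geometry anchors the simulated efficiencies, which are then transferred to the nuclide of interest. All numbers below are made-up illustrative values, not results from the paper.

```python
# Sketch of the "relative" calibration described above: anchor Monte Carlo
# efficiencies to one measurement in the same geometry, then transfer to
# another nuclide. All efficiencies and count rates are illustrative values.

# Measured and simulated detection efficiency (counts per emitted photon)
# for the reference nuclide in the sample geometry.
eff_meas_ref = 0.042
eff_sim_ref = 0.045

# Simulated efficiency for the nuclide of interest in the same geometry.
eff_sim_target = 0.030

# Efficiency transfer: scale the simulation by the measured/simulated ratio.
eff_target = eff_sim_target * (eff_meas_ref / eff_sim_ref)

# Activity estimate from a net count rate R (counts/s) and the emission
# probability of the measured gamma line.
R = 12.5          # counts/s, illustrative
p_gamma = 0.85    # photons per decay, illustrative
activity_bq = R / (eff_target * p_gamma)
print(f"transferred efficiency: {eff_target:.4f}")
print(f"estimated activity: {activity_bq:.0f} Bq")
```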