898 results for Maximum Power Point Tracking algorithms
Abstract:
This paper formulates power allocation policies that maximize the region of mutual informations achievable in multiuser downlink OFDM channels. Arbitrary partitioning of the available tones among users and arbitrary modulation formats, possibly different for every user, are considered. Two distinct policies are derived, respectively for slow fading channels tracked instantaneously by the transmitter and for fast fading channels known to it only statistically. With instantaneous channel tracking, the solution adopts the form of a multiuser mercury/waterfilling procedure that generalizes the single-user mercury/waterfilling introduced in [1, 2]. With only statistical channel information, in contrast, the mercury/waterfilling interpretation is lost. For both policies, a number of limiting regimes are explored and illustrative examples are provided.
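The mercury/waterfilling policy additionally requires the MMSE function of each user's constellation, but its Gaussian-input special case reduces to classical waterfilling. As a point of reference, here is a minimal Python sketch of that classical procedure (the bisection on the water level and the example gains are illustrative, not taken from the paper):

```python
import numpy as np

def waterfilling(gains, total_power, tol=1e-9):
    """Classical waterfilling: maximize sum log(1 + g_i * p_i)
    subject to sum(p_i) = total_power, p_i >= 0.
    Solved by bisection on the water level mu."""
    g = np.asarray(gains, dtype=float)
    lo, hi = 0.0, total_power + 1.0 / g.min()   # bracket for mu
    while hi - lo > tol:
        mu = 0.5 * (lo + hi)
        p = np.maximum(0.0, mu - 1.0 / g)
        if p.sum() > total_power:
            hi = mu
        else:
            lo = mu
    return np.maximum(0.0, 0.5 * (lo + hi) - 1.0 / g)

# Example: three tones with different channel gains
print(waterfilling([2.0, 1.0, 0.25], total_power=3.0))
```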
Abstract:
This paper presents several algorithms for joint estimation of the target number and states in a time-varying scenario. Building on the results presented in [1], which considers estimation of the target number only, we assume that not only the target number but also the target state evolution must be estimated. In this context, we extend the Rao-Blackwellization procedure of [1] to this new scenario in order to compute the Bayes recursions, thus defining reduced-complexity solutions for the multi-target set estimator. A performance assessment is finally given both in terms of Circular Position Error Probability, aimed at evaluating the accuracy of the estimated tracks, and in terms of Cardinality Error Probability, aimed at evaluating the reliability of the target number estimates.
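The paper's Rao-Blackwellized recursions are considerably more involved, but the flavor of a Bayes recursion on the target number alone (the problem treated in [1]) can be illustrated with a simple discrete filter. A minimal sketch, assuming a binomial-detection/Poisson-clutter measurement model; all names and parameters are ours, not the paper's:

```python
import numpy as np
from scipy.stats import binom, poisson

def bayes_update_target_count(prior, transition, n_meas, p_detect, clutter_rate):
    """One predict/update step of a discrete Bayes filter on the number
    of targets n = 0..N_max. Each target is detected with probability
    p_detect; clutter is Poisson with the given rate (illustrative model)."""
    predicted = transition.T @ prior                  # Chapman-Kolmogorov step
    n_max = len(prior) - 1
    lik = np.zeros(n_max + 1)
    for n in range(n_max + 1):
        # P(n_meas | n targets) = sum over d detections of
        # Binomial(d | n, p_detect) * Poisson(n_meas - d | clutter_rate)
        d = np.arange(0, min(n, n_meas) + 1)
        lik[n] = np.sum(binom.pmf(d, n, p_detect) *
                        poisson.pmf(n_meas - d, clutter_rate))
    posterior = predicted * lik
    return posterior / posterior.sum()
```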
Abstract:
AIM: To document the feasibility and report the results of dosing darbepoetin-alpha at extended intervals up to once monthly (QM) in a large dialysis patient population. MATERIAL: 175 adult patients treated, at 23 Swiss hemodialysis centres, with stable doses of any erythropoiesis-stimulating agent who were switched by their physicians to darbepoetin-alpha treatment at prolonged dosing intervals (every 2 weeks [Q2W] or QM). METHOD: Multicentre, prospective, observational study. Patients' hemoglobin (Hb) levels and other data were recorded 1 month before conversion (baseline) to an extended darbepoetin-alpha dosing interval, at the time of conversion, and once monthly thereafter up to the evaluation point (maximum of 12 months or until loss to follow-up). RESULTS: Data for 161 evaluable patients from 23 sites were included in the final analysis. At 1 month prior to conversion, 73% of these patients were receiving darbepoetin-alpha weekly (QW) and 27% of the patients biweekly (Q2W). After a mean follow-up of 9.5 months, 34% received a monthly (QM) dosing regimen, 52% of the patients were receiving darbepoetin-alpha Q2W, and 14% QW. The mean (SD) Hb concentration at baseline was 12.3 +/- 1.2 g/dl, compared to 11.9 +/- 1.2 g/dl at the evaluation point. The corresponding mean weekly darbepoetin-alpha dose was 44.3 +/- 33.4 microg at baseline and 37.7 +/- 30.8 microg at the evaluation point. CONCLUSIONS: Conversion to extended darbepoetin-alpha dosing intervals of up to QM, with maintenance of initial Hb concentrations, was successful for the majority of stable dialysis patients.
Abstract:
We present a novel approach to N-person bargaining, based on the idea that the agreement reached in a negotiation is determined by how the direct conflict resulting from disagreement would be resolved. Our basic building block is the disagreement function, which maps each set of feasible outcomes into a disagreement point. Using this function and a weak axiom based on individual rationality we reach a unique solution: the agreement in the shadow of conflict, ASC. This agreement may be construed as the limit of a sequence of partial agreements, each of which is reached as a function of the parties' relative power. We examine the connection between ASC and asymmetric Nash solutions. We show the connection between the power of the parties embodied in the ASC solution and the bias in the SWF that would select ASC as an asymmetric Nash solution.
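The ASC solution itself is built from the disagreement function, but the asymmetric Nash benchmark it is compared against admits a compact numerical illustration. A minimal sketch, assuming a discretized feasible set and hypothetical weights (all names here are ours, not the paper's):

```python
import numpy as np

def asymmetric_nash(feasible, disagreement, weights):
    """Pick the feasible utility vector maximizing the weighted Nash
    product prod_i (u_i - d_i)^{w_i} over points dominating d."""
    feasible = np.asarray(feasible, dtype=float)
    d = np.asarray(disagreement, dtype=float)
    w = np.asarray(weights, dtype=float)
    gains = feasible - d
    ok = np.all(gains > 0, axis=1)            # individual rationality
    if not ok.any():
        return d                              # nothing improves on conflict
    log_product = (np.log(gains[ok]) * w).sum(axis=1)
    return feasible[ok][np.argmax(log_product)]

# Two parties splitting a unit surplus, sampled on a grid; weights 2:1
grid = [(x, 1.0 - x) for x in np.linspace(0.01, 0.99, 99)]
print(asymmetric_nash(grid, disagreement=(0.0, 0.0), weights=(2.0, 1.0)))
# -> approximately (0.67, 0.33): the stronger party takes 2/3
```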
Abstract:
In the Marketing Business Management course, specifically in Entrepreneurship within the Simulation - Marketing Games discipline, the assignment for the year was to create a company in the computer business on the online business simulator Marketplace, in order to put into practice the theoretical knowledge acquired during the previous semesters. On this platform we were confronted with decisions across eight quarters, four per year, designed to encourage learning in a practical, virtual and dynamic environment. Every quarter we faced well-organized tasks built around defined strategies, such as market research analysis, branding, store management after the company's creation, development of the 4Ps policy, identification of opportunities, monitoring of finances and investment. Each quarter's decisions were followed by results, such as market performance, financial performance, future investments and the "health" of the company's marketing efficiency, which were then analyzed by our company, by the teaching staff and by the competition through the Balanced Scorecard, both semi-annual and cumulative. For the start of activities the company was awarded a total of 2,000,000 in the first year, corresponding to 500,000 in each of the first 4 quarters, plus 5,000,000 in the fifth quarter, for a total of 7,000,000. The invested capital was used to buy market research, open sales offices, create brands, hire a sales force, advertise the products created and carry out R&D activities, in order to make a profit and become self-sufficient, guaranteeing repayment of the invested principal to headquarters (Corporate Headquarters).
Abstract:
We present an environment for analyzing signals of all kinds with LDB (Local Discriminant Bases) and MLDB (Modified Local Discriminant Bases). This environment uses functions developed in the framework of a thesis still in progress. Understanding some of these functions requires an advanced level of knowledge of signal processing. They have been extracted from the work of Naoki Saito [3], which was taken as the starting point for the algorithm of the unfinished doctoral thesis of Jose Antonio Soria. The interface developed accepts the incorporation of new packages and functions: a menu has been prepared for integrating the Sinus IV packet transform and the Cosine IV packet transform, although others can also be added. The application consists of two interfaces, a Wizard and a main interface. We have also created a window for importing and exporting the desired variables to different environments. To build this application, all window elements were programmed directly instead of using MATLAB's GUIDE (Graphical User Interface Development Environment), so that the application remains compatible across different versions of the program. In total we wrote 73 functions for the main interface (10 of which belong to the import/export window) and 23 for the Wizard. In this work we explain only 6 of these functions, plus the 3 that create the interfaces, so as not to make the text excessively long. The functions described are the most important ones, either because they are used often, because they are the most complex according to McCabe complexity, or because they are necessary for signal processing. Every piece of data entered by the user is passed through functions that detect input errors, such as removing zeros or characters that are not numbers, and checking that values are integers and lie within the applicable maximum and minimum limits.
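The input-sanitizing step described at the end (removing stray characters, checking integerness and bounds) can be sketched compactly. The original functions are MATLAB; this is a Python sketch of the same kind of check, and the exact cleaning rules are our assumptions:

```python
import re

def sanitize_numeric_input(raw, lo, hi, integer_only=True):
    """Clean a user-entered value: drop characters that cannot be part
    of a number, optionally require an integer, and enforce [lo, hi]
    bounds. Returns the value or raises ValueError."""
    cleaned = re.sub(r"[^0-9eE+\-.]", "", raw)  # keep digits, sign, dot, exponent
    if not cleaned:
        raise ValueError("no numeric content in input")
    value = float(cleaned)
    if integer_only and value != int(value):
        raise ValueError("an integer is required")
    if not lo <= value <= hi:
        raise ValueError(f"value {value} outside [{lo}, {hi}]")
    return int(value) if integer_only else value

print(sanitize_numeric_input(" 12a ", 0, 64))   # -> 12
```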
Abstract:
Monitoring thunderstorm activity is an essential part of operational weather surveillance given the potential hazards, including lightning, hail, heavy rainfall, strong winds or even tornadoes. This study has two main objectives: firstly, the description of a methodology, based on radar and total lightning data, to characterise thunderstorms in real-time; secondly, the application of this methodology to 66 thunderstorms that affected Catalonia (NE Spain) in the summer of 2006. An object-oriented tracking procedure is employed, where different observation data types generate four different types of objects (radar 1-km CAPPI reflectivity composites, radar reflectivity volumetric data, cloud-to-ground lightning data and intra-cloud lightning data). In the framework proposed, these objects are the building blocks of a higher level object, the thunderstorm. The methodology is demonstrated with a dataset of thunderstorms whose main characteristics, along the complete life cycle of the convective structures (development, maturity and dissipation), are described statistically. The development and dissipation stages present similar durations in most cases examined. In contrast, the duration of the maturity phase is much more variable and related to the thunderstorm intensity, defined here in terms of lightning flash rate. Most of the IC and CG flash activity is registered in the maturity stage. In the development stage few CG flashes are observed (2% to 5%), while in the dissipation phase a few more CG flashes occur (10% to 15%). Additionally, a selection of thunderstorms is used to examine general life cycle patterns, obtained from the analysis of normalized (with respect to thunderstorm total duration and maximum value of the variables considered) thunderstorm parameters. Among other findings, the study indicates that the normalized duration of the three stages of the thunderstorm life cycle is similar in most thunderstorms, with the longest duration corresponding to the maturity stage (approximately 80% of the total time).
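The normalization mentioned above, with respect to total storm duration and the maximum of each variable, is straightforward; a minimal sketch with illustrative data rather than values from the study:

```python
import numpy as np

def normalize_life_cycle(times, values):
    """Normalize a thunderstorm parameter series: time is rescaled by
    total storm duration and the parameter by its maximum, so storms of
    different length and intensity can be compared on common axes."""
    t = np.asarray(times, dtype=float)
    v = np.asarray(values, dtype=float)
    t_norm = (t - t[0]) / (t[-1] - t[0])   # 0 = first detection, 1 = dissipation
    v_norm = v / v.max()                    # 1 = peak value (e.g., flash rate)
    return t_norm, v_norm

# Example: 5-minute flash-rate series of a 60-minute storm
t, r = normalize_life_cycle(np.arange(0, 65, 5),
                            [1, 3, 8, 20, 35, 40, 38, 30, 22, 12, 6, 2, 1])
```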
Abstract:
Neuronal oscillations are an important aspect of EEG recordings. These oscillations are supposed to be involved in several cognitive mechanisms. For instance, oscillatory activity is considered a key component for the top-down control of perception. However, measuring this activity and its influence requires precise extraction of frequency components. This processing is not straightforward. Particularly, difficulties with extracting oscillations arise due to their time-varying characteristics. Moreover, when phase information is needed, it is of the utmost importance to extract narrow-band signals. This paper presents a novel method using adaptive filters for tracking and extracting these time-varying oscillations. This scheme is designed to maximize the oscillatory behavior at the output of the adaptive filter. It is then capable of tracking an oscillation and describing its temporal evolution even during low amplitude time segments. Moreover, this method can be extended in order to track several oscillations simultaneously and to use multiple signals. These two extensions are particularly relevant in the framework of EEG data processing, where oscillations are active at the same time in different frequency bands and signals are recorded with multiple sensors. The presented tracking scheme is first tested with synthetic signals in order to highlight its capabilities. Then it is applied to data recorded during a visual shape discrimination experiment for assessing its usefulness during EEG processing and in detecting functionally relevant changes. This method is an interesting additional processing step for providing alternative information compared to classical time-frequency analyses and for improving the detection and analysis of cross-frequency couplings.
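The paper's scheme maximizes the oscillatory behavior at the adaptive filter output; a much simpler classical relative of such trackers is an LMS-adapted second-order resonator, sketched below purely to illustrate the idea of tracking a time-varying oscillation (this is not the authors' method):

```python
import numpy as np

def track_oscillation(x, mu=1e-3):
    """LMS tracker for a single time-varying oscillation. The signal is
    modelled locally as x[n] ~ a*x[n-1] - x[n-2] with a = 2*cos(omega),
    so adapting 'a' to minimize the prediction error tracks omega."""
    a = 1.0                                   # initial guess (omega = pi/3)
    freq = np.zeros(len(x))
    for n in range(2, len(x)):
        err = x[n] - a * x[n - 1] + x[n - 2]
        a += mu * err * x[n - 1]              # stochastic-gradient update
        a = np.clip(a, -2.0, 2.0)             # keep arccos well defined
        freq[n] = np.arccos(a / 2.0)          # instantaneous estimate, rad/sample
    return freq

# Noisy sinusoid whose frequency drifts from 0.2*pi to 0.3*pi rad/sample
n = np.arange(20000)
omega = 0.2 * np.pi + 0.1 * np.pi * n / len(n)
x = np.cos(np.cumsum(omega)) + 0.1 * np.random.randn(len(n))
print(track_oscillation(x)[-1], omega[-1])    # estimates should be close
```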
Abstract:
The removal of thick layers of soil under native scrubland (Cerrado) on the right bank of the Paraná River in Selvíria (State of Mato Grosso do Sul, Brazil) for construction of the Ilha Solteira Hydroelectric Power Plant caused environmental damage, affecting the revegetation process of the stripped soil. Over the years, various kinds of land use and management systems have been tried, and the aim of this study was to assess the effects of these attempts to restore the structural quality of the soil. The experiment was conducted with five treatments and thirty replications. The following treatments were applied: stripped soil without anthropic intervention and total absence of plant cover; stripped soil treated with sewage sludge and planted to eucalyptus and grass a year ago; stripped soil developing natural secondary vegetation (capoeira) since 1969; pastureland since 1978, replacing the native vegetation; and soil under native vegetation (Cerrado). In the 0.00-0.20 m layer, the soil was chemically characterized for each experimental treatment. A 30-point sampling grid was used to assess soil porosity and bulk density, and to assess aggregate stability in terms of mean weight diameter (MWD) and geometric mean diameter (GMD). Aggregate stability was also determined using simulated rainfall. The results show that using sewage sludge incorporated with a rotary hoe improved the chemical fertility of the soil and produced more uniform soil pore size distribution. Leaving the land to develop secondary vegetation or turning it over to pastureland produced an intermediate level of structural soil quality, and these two treatments produced similar results. Stripped soil without anthropic intervention was of the lowest quality, with the lowest values for cation exchange capacity (CEC) and macroporosity, as well as the highest values of soil bulk density and percentage of aggregates with diameter size <0.50 mm, corroborated by its lower organic matter content. However, the percentage of larger aggregates was higher in the native vegetation treatment, which boosted MWD and GMD values. Therefore, assessment of some land use and management systems shows that even decades after their implementation to mitigate the degenerative effects resulting from the installation of the Hydroelectric Plant, more efficient approaches are still required to recover the structural quality of the soil.
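MWD and GMD follow from standard formulas over the sieve-class mass fractions; a short sketch (the class diameters and fractions below are illustrative, not data from the study):

```python
import numpy as np

def mwd_gmd(mean_diameters, mass_fractions):
    """Mean weight diameter and geometric mean diameter of soil
    aggregates, from sieve-class mean diameters (mm) and the mass
    fraction retained in each class (fractions should sum to 1)."""
    x = np.asarray(mean_diameters, dtype=float)
    w = np.asarray(mass_fractions, dtype=float)
    mwd = np.sum(x * w)                            # MWD = sum_i x_i * w_i
    gmd = np.exp(np.sum(w * np.log(x)) / w.sum())  # GMD = exp(sum w ln x / sum w)
    return mwd, gmd

# Example: four sieve classes
print(mwd_gmd([3.0, 1.5, 0.75, 0.25], [0.40, 0.30, 0.20, 0.10]))
```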
Abstract:
Climate refers to the long-term course or condition of weather, usually over a time scale of decades or longer. It has been documented that our global climate is changing (IPCC 2007, Copenhagen Diagnosis 2009), and Iowa is no exception. In Iowa, statistically significant changes in precipitation, streamflow, nighttime minimum temperatures, winter average temperatures, and dew-point humidity readings have occurred during the past few decades. Iowans are already living with warmer winters, longer growing seasons, warmer nights, higher dew-point temperatures, increased humidity, greater annual streamflows, and more frequent severe precipitation events (Fig. 1-1) than were prevalent during the past 50 years. Some of the impacts of these changes could be construed as positive, and some are negative, particularly the tendency for greater precipitation events and flooding. In the near term, we may expect these trends to continue as long as climate change is prolonged and exacerbated by increasing greenhouse gas emissions globally from the use of fossil fuels and fertilizers, the clearing of land, and agricultural and industrial emissions. This report documents the impacts of changing climate on Iowa during the past 50 years. It seeks to answer two questions: "What impacts of climate change have already been observed in Iowa?" and "What are the effects on public health, our flora and fauna, agriculture, and the general economy of Iowa?"
Abstract:
Images of myocardial strain can be used to diagnose heart disease, plan and monitor treatment, and to learn about cardiac structure and function. Three-dimensional (3D) strain is typically quantified using many magnetic resonance (MR) images obtained in two or three orthogonal planes. Problems with this approach include long scan times, image misregistration, and through-plane motion. This article presents a novel method for calculating cardiac 3D strain using a stack of two or more images acquired in only one orientation. The zHARP pulse sequence encodes in-plane motion using MR tagging and out-of-plane motion using phase encoding, and has been previously shown to be capable of computing 3D displacement within a single image plane. Here, data from two adjacent image planes are combined to yield a 3D strain tensor at each pixel; stacks of zHARP images can be used to derive stacked arrays of 3D strain tensors without imaging multiple orientations and without numerical interpolation. The performance and accuracy of the method are demonstrated in vitro on a phantom and in vivo in four healthy adult human subjects.
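Once a 3D displacement gradient is available at a pixel (e.g., from the combined zHARP phase data of two adjacent planes), the strain tensor follows from the standard continuum-mechanics definition. A sketch of that last step only; the zHARP-specific displacement estimation is the paper's contribution and is not reproduced here:

```python
import numpy as np

def green_lagrange_strain(grad_u):
    """Green-Lagrange strain tensor E from a 3x3 displacement gradient
    du_i/dX_j: E = 0.5 * (F^T F - I), with F = I + grad_u."""
    F = np.eye(3) + np.asarray(grad_u, dtype=float)
    return 0.5 * (F.T @ F - np.eye(3))

# Example: 5% stretch along x plus a small xy shear
grad_u = np.array([[0.05, 0.02, 0.0],
                   [0.0,  0.0,  0.0],
                   [0.0,  0.0,  0.0]])
print(green_lagrange_strain(grad_u))
```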
Abstract:
A new paint testing device was built to determine the resistance of paints to darkening due to road grime being tracked onto them. The device consists of a tire rotating on a sample drum. Soil was applied to the tire and then tracked onto paint samples which were attached to the drum. A colorimeter was used to measure the lightness of the paints after being tracked. Lightness is measured from 0 (absolute black) to 100 (absolute white). Four experiments were run to determine the optimum time length to track a sample, the reproducibility, the effects of different soils, and the maximum acceptable level for darkening of a paint. The following conclusions were reached: 1) the optimum tracking time was 10 minutes; 2) the reproducibility had a standard deviation of 1.5 lightness units; 3) different soils did not have a large effect on the amount of darkening on the paints; 4) a maximum acceptable darkness could not be established based on the limited amount of data; and 5) a correlation exists between the paints which were darkening in the field and the paints which were turning the darkest on the tracking wheel.
Abstract:
The state of the art to describe image quality in medical imaging is to assess the performance of an observer conducting a task of clinical interest. This can be done by using a model observer leading to a figure of merit such as the signal-to-noise ratio (SNR). Using the non-prewhitening (NPW) model observer, we objectively characterised the evolution of its figure of merit in various acquisition conditions. The NPW model observer usually requires the use of the modulation transfer function (MTF) as well as noise power spectra. However, although the computation of the MTF poses no problem when dealing with the traditional filtered back-projection (FBP) algorithm, this is not the case when using iterative reconstruction (IR) algorithms, such as adaptive statistical iterative reconstruction (ASIR) or model-based iterative reconstruction (MBIR). Given that the target transfer function (TTF) had already been shown to accurately express the system resolution even with non-linear algorithms, we decided to tune the NPW model observer, replacing the standard MTF by the TTF. It was estimated using a custom-made phantom containing cylindrical inserts surrounded by water. The contrast differences between the inserts and water were plotted for each acquisition condition. Then, mathematical transformations were performed leading to the TTF. As expected, the first results showed a dependency of the TTF on the image contrast and noise levels for both ASIR and MBIR. Moreover, FBP also proved to be dependent on contrast and noise when using the lung kernel. Those results were then introduced into the NPW model observer. We observed an enhancement of SNR every time we switched from FBP to ASIR to MBIR. IR algorithms greatly improve image quality, especially in low-dose conditions. Based on our results, the use of MBIR could lead to further dose reduction in several clinical applications.
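With radially averaged quantities, the NPW figure of merit has a standard closed form; below is a sketch using the TTF in place of the MTF, as the abstract proposes. The task function, TTF and NPS values are made up for illustration:

```python
import numpy as np
from scipy.integrate import trapezoid
from scipy.special import j1

def npw_snr(f, task, ttf, nps):
    """NPW observer SNR from radially averaged quantities:
    SNR^2 = [int S^2 TTF^2 2*pi*f df]^2 / int S^2 TTF^2 NPS 2*pi*f df,
    where S is the task function (Fourier transform of the object)."""
    s2t2 = np.asarray(task) ** 2 * np.asarray(ttf) ** 2
    num = trapezoid(s2t2 * 2 * np.pi * f, f) ** 2
    den = trapezoid(s2t2 * np.asarray(nps) * 2 * np.pi * f, f)
    return np.sqrt(num / den)

# Illustrative inputs: 5-mm-radius disk task, Gaussian TTF, flat NPS
f = np.linspace(1e-3, 1.0, 500)                          # cycles/mm
task = np.pi * 5.0**2 * 2 * j1(2 * np.pi * f * 5.0) / (2 * np.pi * f * 5.0)
ttf = np.exp(-(f / 0.4) ** 2)
nps = 50.0 * np.ones_like(f)                             # arbitrary units
print(npw_snr(f, task, ttf, nps))
```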
Abstract:
When decommissioning a nuclear facility it is important to be able to estimate activity levels of potentially radioactive samples and compare them with clearance values defined by regulatory authorities. This paper presents a method of calibrating a clearance box monitor based on practical experimental measurements and Monte Carlo simulations. Adjusting the simulation against experimental data obtained using a simple point source permits the computation of absolute calibration factors for more complex geometries, with an accuracy of slightly more than 20%. The uncertainty of the calibration factor can be improved to about 10% when the simulation is used relatively, in direct comparison with a measurement performed in the same geometry but with another nuclide. The simulation can also be used to validate the experimental calibration procedure when the sample is assumed to be homogeneous but the calibration factor is derived from a plate phantom. For more realistic geometries, like a small gravel dumpster, Monte Carlo simulation shows that the calibration factor obtained with a larger homogeneous phantom is correct within about 20%, if sample density is taken as the influencing parameter. Finally, simulation can be used to estimate the effect of a contamination hotspot. The research supporting this paper shows that activity could be largely underestimated in the event of a centrally-located hotspot and overestimated for a peripherally-located hotspot if the sample is assumed to be homogeneously contaminated. This demonstrates the usefulness of being able to complement experimental methods with Monte Carlo simulations in order to estimate calibration factors that cannot be directly measured because of a lack of available material or specific geometries.
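The "relative" use of the simulation described above amounts to transferring a measured calibration factor through the ratio of simulated efficiencies. A sketch under that assumption; the formula, names and numbers are illustrative, not from the paper:

```python
def relative_calibration_factor(cf_ref_measured, sim_eff_sample, sim_eff_ref):
    """Transfer a measured calibration factor from a reference nuclide or
    geometry to another one using simulated efficiencies:
    CF_sample = CF_ref * (eff_sim_sample / eff_sim_ref)."""
    return cf_ref_measured * (sim_eff_sample / sim_eff_ref)

# Illustrative numbers only
cf = relative_calibration_factor(cf_ref_measured=1.2e-3,
                                 sim_eff_sample=4.1e-4,
                                 sim_eff_ref=3.8e-4)
print(f"estimated calibration factor: {cf:.3e}")
```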
Abstract:
Positive selection is widely estimated from protein coding sequence alignments by the nonsynonymous-to-synonymous ratio omega. Increasingly elaborate codon models are used in a likelihood framework for this estimation. Although there is widespread concern about the robustness of the estimation of the omega ratio, more efforts are needed to assess this robustness, especially in the context of complex models. Here, we focused on the branch-site codon model. We investigated its robustness on a large set of simulated data. First, we investigated the impact of sequence divergence. We found evidence of underestimation of the synonymous substitution rate for dS values as small as 0.5, with a slight increase in false positives for the branch-site test. When dS increases further, underestimation of dS is worse, but false positives decrease. Interestingly, the detection of true positives follows a similar distribution, with a maximum for intermediate values of dS. Thus, high dS is more of a concern for a loss of power (false negatives) than for false positives of the test. Second, we investigated the impact of GC content. We showed that there is no significant difference in false positives between high GC (up to ~80%) and low GC (~30%) genes. Moreover, neither shifts of GC content on a specific branch nor major shifts in GC along the gene sequence generate many false positives. Our results confirm that the branch-site test is very conservative.
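The branch-site test itself is a likelihood-ratio test, and its mechanics are compact enough to sketch. The log-likelihood values below are made up, and the chi-square with 1 df is the usual, conservative choice of null distribution for this test:

```python
from scipy.stats import chi2

def branch_site_lrt(lnl_alt, lnl_null, df=1):
    """Likelihood-ratio test for the branch-site model: twice the
    log-likelihood difference between the alternative model (omega > 1
    allowed on the foreground branch) and the null, compared against a
    chi-square distribution with 1 degree of freedom."""
    stat = 2.0 * (lnl_alt - lnl_null)
    return stat, chi2.sf(stat, df)

stat, p = branch_site_lrt(lnl_alt=-10234.7, lnl_null=-10237.9)  # made-up values
print(f"2*dlnL = {stat:.2f}, p = {p:.4f}")
```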