570 results for Superiority


Relevance: 10.00%

Publisher:

Abstract:

In 1996, the authors of the Canadian Royal Commission on Aboriginal Peoples concluded that Canadian educational policy had been based on the false assumption of the superiority of European worldviews. The report's authors recommended the transformation of curriculum and schools to recognize that European knowledge was not universal. Aboriginal researcher Battiste believes the current system of Canadian education causes Aboriginal children to face cognitive imperialism and cognitive assimilation, and that this ongoing practice of cultural racism in Canada makes educational institutions a hostile environment for Aboriginal learners. To counter this cultural racism, Battiste calls for the decolonization of education. In 2005, the president of Northwest Community College (NWCC) publicly committed to decolonizing the college in order to address the continuing disparity in educational attainment between Aboriginal and non-Aboriginal learners. Upon the president's departure in 2010, the employees of NWCC were left to define for themselves the meaning of decolonization. This qualitative study was designed to build an NWCC definition of colonization and decolonization by collecting researcher observations, nine weeks of participant blog postings, and pre- and post-blog Word survey responses drawn from a purposeful sample of six Aboriginal and six non-Aboriginal NWCC employees selected from the staff, instructor and administrator employee groups. The findings revealed that NWCC employees held multiple definitions of colonization and decolonization, which did not vary across employee groups or by participant gender; differences were found, however, based on whether participants were Aboriginal or non-Aboriginal. Both Aboriginal and non-Aboriginal participants thought decolonization was a worthy goal for the college. Aboriginal participants felt hopeful that decolonization would happen in the future and thought decolonization had to do with moving forward to a time when they would be valued, respected, empowered, unashamed, safe, and viewed as equal to non-Aboriginal people. Non-Aboriginal participants were unsure whether decolonization was possible because they saw it as requiring a return to an earlier time to restore the Aboriginal way of life. When non-Aboriginal participants felt their thoughts were not being valued, or felt they were being associated with colonialism, they became angry and guarded and were uncomfortable with Aboriginal participants expressing anger towards colonizers.

Relevance: 10.00%

Publisher:

Abstract:

This article presents a Marxist analysis of the US film "Captain Phillips" (Paul Greengrass, 2013), which is based on a true story. I show how the evolution of capitalism in the West continues to consolidate the reified belief in a historical and geographical superiority of Western political and socioeconomic models over the supposedly lesser models of Africa and Asia. At the same time, through categories such as dialectical materialism, the critique of diffusionist theory and the application of cognitive mapping to large geopolitical spaces located in the poorest areas of the world, I offer an observation on how the political unconscious of the working class in rich countries and of the poor in poor countries is currently being articulated, establishing a relationship between the ideological representation an individual forms of his historical reality (on a scale that moves from the local to the global) and the mental capacity he has developed to escape the responsibility of critically examining what is happening around him in all areas. Finally, through the physical space captured in the film, I develop a materialist critique of the globalized business process that takes place through the carriage of goods, outlining the spatial and cognitive limits of the mentality of our time, among both the "winners" and the "losers", based on the spatial movement of capital.

Relevance: 10.00%

Publisher:

Abstract:

This article quantifies the presence of works by women artists in 21 Spanish museums and contemporary art centres. The results show a clear under-representation in the work exhibited, below 20 per cent. Why does this happen? Is it a difference in artistic potential between women and men? Male superiority? Discrimination? Or an art system with an androcentric bias? These pages discuss the role of several factors in explaining the gender gap and call on public administrations and cultural management institutions to comply with the Equality Act (Ley para la Igualdad) in order to guarantee parity.

Relevance: 10.00%

Publisher:

Abstract:

What is the human being? What are its origin and its end? What is the influence of nature on man, and what is his impact on nature? For the animalists, humans are like other animals; freedom and rationality are neither signs of superiority nor grounds for rights over animals. For the ecohumanists, the human being is part of nature but is qualitatively different from and superior to animals, and is the creator of civilization. We analyze these two ecological outlooks. Special attention is given to the contribution of the ecohumanists of the first half of the Renaissance, who dealt in extenso with the dignity and freedom of the human being; of Michelangelo; and, finally, of Mozart, through his four unsurpassable operas, which display how difficult it is for physical ecology to engender so much beauty, so much richness, so much love for creatures and so much variety.

Relevance: 10.00%

Publisher:

Abstract:

With Tweet volumes reaching 500 million a day, sampling is inevitable for any application using Twitter data. Realizing this, data providers such as Twitter, Gnip and Boardreader license sampled data streams priced in accordance with the sample size. Big Data applications working with sampled data are interested in a sample large enough to be representative of the universal dataset. Previous work on the representativeness issue has focused on ensuring that the global occurrence rates of key terms can be reliably estimated from the sample. Existing techniques allow sample sizes to be estimated so that probabilistic bounds on occurrence rates hold in the case of uniform random sampling. In this paper, we consider the problem of further improving sample size estimates by leveraging stratification in Twitter data. We analyze our estimates through an extensive study using simulations and real-world data, establishing the superiority of our method over uniform random sampling. Our work provides the technical know-how for data providers to expand their portfolio to include stratified sampled datasets, while applications benefit by being able to monitor more topics and events at the same data and computing cost.
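As a rough illustration of why stratification can shrink the required sample (this is not the estimator developed in the paper; the strata, occurrence rates and error bound below are invented for the example), the following Python sketch compares the sample size needed to estimate a term's global occurrence rate to within a given margin at 95% confidence under uniform random sampling against a Neyman-style stratified allocation:

```python
import math

# Hypothetical strata (e.g. tweets grouped by language or region), with
# assumed stream shares and per-stratum occurrence rates of a key term.
strata = [
    {"weight": 0.5, "p": 0.01},
    {"weight": 0.3, "p": 0.10},
    {"weight": 0.2, "p": 0.40},
]

z = 1.96      # 95% confidence
eps = 0.005   # desired margin of error on the global occurrence rate

# Uniform random sampling: variance of the rate estimate is p*(1-p)/n.
p_global = sum(s["weight"] * s["p"] for s in strata)
n_uniform = math.ceil((z / eps) ** 2 * p_global * (1 - p_global))

# Stratified sampling with Neyman allocation: sampling each stratum in
# proportion to weight * within-stratum standard deviation minimises the
# variance of the weighted estimate for a fixed total sample size.
total_ws = sum(s["weight"] * math.sqrt(s["p"] * (1 - s["p"])) for s in strata)
n_stratified = math.ceil((z / eps) ** 2 * total_ws ** 2)

print(f"global rate      : {p_global:.4f}")
print(f"uniform sample   : {n_uniform}")
print(f"stratified sample: {n_stratified}")
```

Because the stratified variance depends on the weighted sum of within-stratum standard deviations rather than on the overall rate, the stratified total is never larger than the uniform one and is markedly smaller when the term is concentrated in a few strata.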

Relevance: 10.00%

Publisher:

Abstract:

Recommendation systems aim to help users make decisions more efficiently. The most widely used method in recommendation systems is collaborative filtering, in which a critical step is to analyze a user's preferences and recommend products or services based on similarity with other users' ratings. However, collaborative filtering is of limited use when recommendation faces the "cold start" problem, i.e. when few ratings or comments have been given for products or services. To tackle this problem, we propose an improved method that combines collaborative filtering and data classification. We use hotel recommendation data to test the proposed method. The accuracy of the recommendation is determined by the rankings. Evaluations of the accuracy of Top-3 and Top-10 recommendation lists are conducted using 10-fold cross-validation and ROC curves. The results show that, under the cold-start condition, the Top-3 hotel recommendation list produced by the combined method outperforms the Top-10 list in most cases.
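A minimal sketch of the collaborative-filtering side of such a hybrid (the ratings matrix, neighbourhood size and items are invented, and the paper's classification component is only indicated by the cold-start fallback) might look like this in Python:

```python
import numpy as np

# Toy user-item rating matrix (rows: users, cols: hotels); 0 = no rating.
R = np.array([
    [5, 4, 0, 1],
    [4, 5, 0, 2],
    [1, 2, 5, 4],
    [0, 1, 4, 5],
], dtype=float)

def cosine_sim(a, b):
    """Cosine similarity computed over co-rated items only."""
    mask = (a > 0) & (b > 0)
    if not mask.any():
        return 0.0
    a, b = a[mask], b[mask]
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def predict(R, user, item, k=2):
    """Predict a rating as the similarity-weighted mean of neighbours' ratings."""
    sims = [(cosine_sim(R[user], R[v]), v)
            for v in range(R.shape[0]) if v != user and R[v, item] > 0]
    sims.sort(reverse=True)
    top = sims[:k]
    if not top or sum(s for s, _ in top) == 0:
        return None  # cold start: no usable neighbours for this item
    return sum(s * R[v, item] for s, v in top) / sum(s for s, _ in top)

print(predict(R, user=0, item=2))  # inferred from users 2 and 3
```

When `predict` returns `None`, no neighbour has rated the item, which is exactly the cold-start situation where a combined method would fall back to a classifier built from item or user attributes.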

Relevance: 10.00%

Publisher:

Abstract:

Providing good customer service, inexpensively, is a problem commonly faced by managers of service operations. To tackle this problem, managers must do four tasks: forecast customer demand for the service; translate these forecasts into employee requirements; develop a labor schedule that provides appropriate numbers of employees at appropriate times; and control the delivery of the service in real time. This paper focuses on the translation of forecasts of customer demand into employee requirements. Specifically, it presents and evaluates two methods for determining desired staffing levels. One is a traditional approach to the task, while the other, by using modified customer arrival rates, better accounts for the multi-period impact of customer service. To calculate the modified arrival rates, the latter method reduces the actual arrival rate for a period to account for customers who arrive in that period but receive part of their service in subsequent periods, and increases it to account for customers who arrived in earlier periods but receive part of their service in the period. In an experiment simulating 13,824 service delivery environments, the new method demonstrated its superiority by serving 2.74% more customers within the specified waiting-time limit while using 7.57% fewer labor hours.
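A minimal sketch of the modified-arrival-rate idea, assuming a hypothetical spillover profile that states what fraction of a period's service work is actually performed k periods later (the fractions and arrival forecasts are invented; the paper's exact adjustment may differ):

```python
# spill[k] = fraction of a period's service workload performed k periods later.
spill = [0.7, 0.2, 0.1]          # 70% served in the arrival period, 20% next, 10% after
arrivals = [30, 50, 80, 60, 40]  # forecast arrivals per period

modified = [0.0] * len(arrivals)
for t, a in enumerate(arrivals):
    for k, frac in enumerate(spill):
        if t + k < len(arrivals):
            # move the share of period t's workload that is actually
            # performed in period t+k into that period's modified rate
            modified[t + k] += a * frac

print([round(m, 1) for m in modified])
# e.g. period 0 is reduced (30 -> 21.0) while period 2 absorbs earlier spillover
```

Staffing each period against `modified` rather than `arrivals` then credits later periods with the carried-over workload.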

Relevance: 10.00%

Publisher:

Abstract:

Background: Individual placement and support (IPS) is effective in helping patients return to work but is poorly implemented because of clinical ambivalence and fears of relapse. Aims: To assess whether a motivational intervention (motivational interviewing) directed at clinical staff, addressing ambivalence about employment, improved patients' occupational outcomes. Method: Two of four early intervention teams that already provided IPS were randomised to receive motivational interviewing training for clinicians, focused on attitudinal barriers to employment. The trial was registered with the International Standard Randomised Controlled Trial Register (ISRCTN71943786). Results: Of 300 eligible participants, 159 consented to the research. Occupational outcomes were obtained for 134 patients (85%) at 12-month follow-up. More patients in the intervention teams than in the IPS-only teams achieved employment by 12 months (29/68 v. 12/66). A random-effects logistic regression accounting for clustering by care coordinator, and adjusted for participants' gender, ethnicity, educational and employment history and clinical status scores, confirmed the superiority of the intervention (odds ratio = 4.3, 95% CI 1.5-16.6). Conclusions: Employment outcomes were enhanced by addressing clinicians' ambivalence about their patients returning to work.
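As a rough illustration of adjusting a binary employment outcome for clustering by care coordinator (hypothetical data; a GEE with an exchangeable working correlation is used here as a stand-in for the random-effects logistic regression reported in the paper), one might fit:

```python
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Hypothetical trial data: one row per participant, with the care
# coordinator as the clustering unit and arm = 1 for intervention teams.
df = pd.DataFrame({
    "employed":    [1, 0, 1, 1, 0, 0, 1, 0, 0, 1, 1, 0],
    "arm":         [1, 1, 1, 1, 1, 1, 0, 0, 0, 0, 0, 0],
    "female":      [0, 1, 1, 0, 1, 0, 0, 1, 1, 0, 1, 0],
    "coordinator": [1, 1, 2, 2, 3, 3, 4, 4, 5, 5, 6, 6],
})

# Marginal logistic model with an exchangeable working correlation within
# coordinators; a mixed-effects logit would be the closer analogue of the
# random-effects model described in the abstract.
model = smf.gee(
    "employed ~ arm + female",
    groups="coordinator",
    data=df,
    family=sm.families.Binomial(),
    cov_struct=sm.cov_struct.Exchangeable(),
)
result = model.fit()
print(result.summary())
```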

Relevance: 10.00%

Publisher:

Abstract:

Objective: Real-time functional magnetic resonance imaging (rt-fMRI) neurofeedback (NF) uses feedback of the patient's own brain activity to self-regulate brain networks, which in turn could lead to a change in behaviour and clinical symptoms. The objective was to determine the effect of neurofeedback combined with motor training versus motor training (MOT) alone on motor and non-motor functions in Parkinson's disease (PD) in a small 10-week Phase I randomised controlled trial. Methods: 30 patients with PD (Hoehn & Yahr I-III) and no significant comorbidity took part in the trial, with random allocation to two groups. Group 1 (NF: 15 patients) received rt-fMRI-NF with motor training. Group 2 (MOT: 15 patients) received motor training alone. The primary outcome measure was the Movement Disorder Society - Unified Parkinson's Disease Rating Scale-Motor scale (MDS-UPDRS-MS), administered pre- and post-intervention 'off-medication'. The secondary outcome measures were the 'on-medication' MDS-UPDRS, the Parkinson's Disease Questionnaire-39, and quantitative motor assessments after 4 and 10 weeks. Results: Patients in the NF group were able to upregulate activity in the supplementary motor area by using motor imagery. They improved by an average of 4.5 points on the MDS-UPDRS-MS in the 'off-medication' state (95% confidence interval: -2.5 to -6.6), whereas the MOT group improved by only 1.9 points (95% confidence interval +3.2 to -6.8). However, the improvement did not differ significantly between the groups. No adverse events were reported in either group. Interpretation: This Phase I study suggests that NF combined with motor training is safe and improves motor symptoms immediately after treatment, but larger trials are needed to explore its superiority over active control conditions. Clinical trial registration: NCT01867827; URL: https://clinicaltrials.gov/ct2/show/NCT01867827?term=NCT01867827&rank=1

Relevance: 10.00%

Publisher:

Abstract:

Forecasting is the basis for making strategic, tactical and operational business decisions. In financial economics, several techniques have been used over the past decades to predict the behavior of assets. There are thus several methods to assist in the task of time series forecasting; however, conventional modeling techniques, such as statistical models and those based on theoretical mathematical models, have produced unsatisfactory predictions, increasing the number of studies of more advanced prediction methods. Among these, Artificial Neural Networks (ANNs) are a relatively new and promising method that has attracted considerable interest in the financial field and has been used successfully in a wide variety of financial modeling applications, in many cases proving superior to statistical ARIMA-GARCH models. In this context, this study examined whether ANNs are a more appropriate method for predicting the behavior of capital market indices than traditional time series methods. For this purpose we developed a quantitative study based on financial and economic indices and built two feedforward, supervised-learning ANN models whose structure consisted of 20 inputs, 90 neurons in a single hidden layer, and one output (the Ibovespa). These models were trained by backpropagation, with a hyperbolic tangent (tan-sigmoid) activation function in the hidden layer and a linear output function. To analyze how well the ANN approach predicts the Ibovespa, we compared its results against a GARCH(1,1) time series model. After applying both methods (ANN and GARCH), we analyzed the results by comparing the forecasts with the historical data and by studying the forecast errors using MSE, RMSE, MAE, standard deviation, Theil's U and forecast encompassing tests. The ANN models had lower MSE, RMSE and MAE than the GARCH(1,1) model, and Theil's U indicated that all three models have smaller errors than a naïve forecast. Although the ANN based on returns has lower precision-indicator values than the ANN based on prices, the forecast encompassing test rejected the hypothesis that one model is better than the other, indicating that the ANN models have a similar level of accuracy. It was concluded that, for the data series studied, the ANN models provide more appropriate Ibovespa forecasts than the traditional time series models, represented here by the GARCH model.
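A minimal sketch of the network structure described above, 20 lagged inputs, one hidden layer of 90 tanh neurons and a linear output trained by backpropagation, using scikit-learn on a synthetic series (the study's actual Ibovespa data and preprocessing are not reproduced here):

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.metrics import mean_squared_error, mean_absolute_error

# Synthetic stand-in for an index series (the real study used Ibovespa data).
rng = np.random.default_rng(0)
series = np.cumsum(rng.normal(0, 1, 1500))

LAGS = 20  # 20 inputs, as in the architecture described above

# Build a supervised dataset: 20 lagged values -> next value.
X = np.array([series[i:i + LAGS] for i in range(len(series) - LAGS)])
y = series[LAGS:]
split = int(0.8 * len(X))
X_train, X_test, y_train, y_test = X[:split], X[split:], y[:split], y[split:]

# One hidden layer of 90 tanh neurons with a linear (identity) output,
# trained by backpropagation, mirroring the structure in the abstract.
model = MLPRegressor(hidden_layer_sizes=(90,), activation="tanh",
                     solver="adam", max_iter=2000, random_state=0)
model.fit(X_train, y_train)
pred = model.predict(X_test)

mse = mean_squared_error(y_test, pred)
print(f"MSE : {mse:.4f}")
print(f"RMSE: {np.sqrt(mse):.4f}")
print(f"MAE : {mean_absolute_error(y_test, pred):.4f}")
```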

Relevance: 10.00%

Publisher:

Abstract:

This dissertation contains four essays that share a common purpose: developing new methodologies to exploit the potential of high-frequency data for the measurement, modeling and forecasting of financial asset volatility and correlations. The first two chapters provide useful tools for univariate applications, while the last two chapters develop multivariate methodologies. In chapter 1, we introduce a new class of univariate volatility models named FloGARCH models. FloGARCH models provide a parsimonious joint model for low-frequency returns and realized measures, and are sufficiently flexible to capture long memory as well as asymmetries related to leverage effects. We analyze the performance of the models in a realistic numerical study and on a data set composed of 65 equities. Using more than 10 years of high-frequency transactions, we document significant statistical gains from the FloGARCH models in terms of in-sample fit, out-of-sample fit and forecasting accuracy compared to classical and Realized GARCH models. In chapter 2, using 12 years of high-frequency transactions for 55 U.S. stocks, we argue that combining low-frequency exogenous economic indicators with high-frequency financial data improves the ability of conditionally heteroskedastic models to forecast the volatility of returns, their full multi-step-ahead conditional distribution and the multi-period Value-at-Risk. Using a refined version of the Realized LGARCH model allowing for a time-varying intercept and implemented with realized kernels, we document that nominal corporate profits and term spreads have strong long-run predictive ability and generate accurate risk-measure forecasts over long horizons. The results are based on several loss functions and tests, including the Model Confidence Set. Chapter 3 is joint work with David Veredas. We study the class of disentangled realized estimators for the integrated covariance matrix of Brownian semimartingales with finite-activity jumps. These estimators separate correlations and volatilities. We analyze different combinations of quantile- and median-based realized volatilities, and four estimators of realized correlations with three synchronization schemes. Their finite-sample properties are studied under four data-generating processes, in the presence or absence of microstructure noise, and under synchronous and asynchronous trading. The main finding is that the pre-averaged version of the disentangled estimators based on Gaussian ranks (for the correlations) and median deviations (for the volatilities) provides a precise, computationally efficient and easy alternative for measuring integrated covariances from noisy and asynchronous prices. Along these lines, a minimum-variance portfolio application shows the superiority of this disentangled realized estimator in terms of numerous performance metrics. Chapter 4 is co-authored with Niels S. Hansen, Asger Lunde and Kasper V. Olesen, all affiliated with CREATES at Aarhus University. We propose to use the Realized Beta GARCH model to exploit the potential of high-frequency data in commodity markets. The model produces high-quality forecasts of pairwise correlations between commodities which can be used to construct a composite covariance matrix. We evaluate the quality of this matrix in a portfolio context and compare it to models used in the industry. We demonstrate significant economic gains in a realistic setting including short-selling constraints and transaction costs.
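As a small illustration of the kind of realized measure these models take as input (plain realized variance from intraday returns; the chapters themselves rely on more refined estimators such as realized kernels, and the prices below are simulated):

```python
import numpy as np

def realized_variance(intraday_prices):
    """Daily realized variance: sum of squared intraday log returns."""
    log_ret = np.diff(np.log(intraday_prices))
    return float(np.sum(log_ret ** 2))

# Synthetic example: one trading day of 5-minute prices (79 observations).
rng = np.random.default_rng(1)
prices = 100 * np.exp(np.cumsum(rng.normal(0, 0.0005, 79)))
rv = realized_variance(prices)
print(f"realized variance: {rv:.6e}")
print(f"annualised vol   : {np.sqrt(252 * rv):.2%}")
```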

Relevance: 10.00%

Publisher:

Abstract:

Microsecond-long Molecular Dynamics (MD) trajectories of biomolecular processes are now possible due to advances in computer technology. Soon, trajectories long enough to probe dynamics over many milliseconds will become available. Since these timescales match the physiological timescales over which many small proteins fold, all-atom MD simulations of protein folding are now becoming popular. To distill features of such large folding trajectories, we must develop methods that can compress trajectory data to enable visualization and that lend themselves to further analysis, such as finding collective coordinates and reducing the dynamics. Conventionally, clustering has been the most popular MD trajectory analysis technique, followed by principal component analysis (PCA). Simple clustering as used in MD trajectory analysis suffers from several serious drawbacks: (i) it is not data driven, (ii) it is unstable to noise and to changes in cutoff parameters, and (iii) because it does not take into account interrelationships among data points, the separation of data into clusters can often be artificial. Usually, partitions generated by clustering techniques are validated visually, but such validation is not possible for MD trajectories of protein folding, as the underlying structural transitions are not well understood. Rigorous cluster validation techniques may be adapted, but it is more crucial to reduce the dimensions in which MD trajectories reside while still preserving their salient features. PCA has often been used for dimension reduction, and while it is computationally inexpensive, being a linear method it does not achieve good data compression. In this thesis, I propose a different method, a nonmetric multidimensional scaling (nMDS) technique, which achieves superior data compression by virtue of being nonlinear and also provides clear insight into the structural processes underlying MD trajectories. I illustrate the capabilities of nMDS by analyzing three complete villin headpiece folding trajectories and six norleucine mutant (NLE) folding trajectories simulated by Freddolino and Schulten [1]. Using these trajectories, I compare nMDS, PCA and clustering to demonstrate the superiority of nMDS. The three villin headpiece trajectories showed great structural heterogeneity. Apart from a few trivial features such as the early formation of secondary structure, no commonalities between trajectories were found. No units of residues or atoms were found moving in concert across the trajectories. A flipping transition, corresponding to the flipping of helix 1 relative to the plane formed by helices 2 and 3, was observed towards the end of the folding process in all trajectories, when nearly all native contacts had been formed. However, the transition occurred through a different series of steps in each trajectory, indicating that it may not be a common transition in villin folding. All trajectories showed competition between local structure formation/hydrophobic collapse and global structure formation. Our analysis of the NLE trajectories confirms the notion that a tight hydrophobic core inhibits correct 3-D rearrangement. Only one of the six NLE trajectories folded, and it showed no flipping transition; all the other trajectories become trapped in hydrophobically collapsed states. The NLE residues were found to be buried deeply in the core, compared to the corresponding lysines in the villin headpiece, making the core tighter and harder to undo for 3-D rearrangement. Our results suggest that NLE may not be the fast folder that experiments suggest. The tightness of the hydrophobic core may be a very important factor in the folding of larger proteins. It is likely that chaperones such as GroEL act to undo the tight hydrophobic core of proteins after most secondary structure elements have been formed, so that global rearrangement is easier. I conclude by presenting facts about chaperone-protein complexes and propose further directions for the study of protein folding.
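A minimal sketch of the nMDS step on a precomputed distance matrix, using scikit-learn and a synthetic random-walk "trajectory" in place of real MD frames and RMSD distances:

```python
import numpy as np
from sklearn.manifold import MDS

# Toy stand-in for a trajectory: a random walk in a high-dimensional
# conformation space (real input would be frames of an MD trajectory).
rng = np.random.default_rng(2)
frames = np.cumsum(rng.normal(0, 1, (200, 30)), axis=0)

# Pairwise distances between frames (an RMSD matrix would be used for
# real protein conformations).
diff = frames[:, None, :] - frames[None, :, :]
D = np.sqrt((diff ** 2).sum(axis=-1))

# Non-metric MDS preserves only the rank order of distances, allowing a
# stronger, nonlinear compression than PCA's linear projection.
embedding = MDS(n_components=2, metric=False, dissimilarity="precomputed",
                random_state=0)
coords = embedding.fit_transform(D)
print(coords.shape)   # (200, 2): a low-dimensional picture of the trajectory
print(f"stress: {embedding.stress_:.3f}")
```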

Relevance: 10.00%

Publisher:

Abstract:

Background and Objectives: Schizophrenia is a severe chronic disease. Endpoint variables lack objectivity and the diagnostic criteria have evolved over time. To guide the development of new drugs, the European Medicines Agency (EMA) issued a guideline on the clinical investigation of medicinal products for the treatment of schizophrenia. Methods: The authors reviewed and discussed the efficacy-trial part of the Guideline. Results: The Guideline divides clinical efficacy trials into short-term and long-term trials. The short-term three-arm trial is recommended to replace the short-term two-arm active-controlled non-inferiority trial because the latter has sensitivity issues. The Guideline ultimately makes that three-arm trial a superiority trial. The Guideline discusses four types of long-term trial design. The randomized withdrawal trial design has some disadvantages. A long-term two-arm active-controlled non-inferiority trial is not recommended because of the sensitivity issue. Extension of the short-term trial is only suitable for extending the short-term two-arm active-controlled superiority trial. The Guideline suggests that a hybrid design, in which a randomized withdrawal trial is incorporated into a long-term parallel trial, might be optimal. However, such a design has some disadvantages and might be too complex to carry out. The authors instead suggest a three-group long-term trial design, which could provide a comparison between the test drug and the active comparator along with a comparison between the test drug and placebo. This alternative could arguably be much easier to carry out than the hybrid design. Conclusions: The three-group long-term design merits further discussion and evaluation.

Relevance: 10.00%

Publisher:

Abstract:

Peaches and nectarines are highly appreciated by consumers, but they are climacteric fruits that are available on the market for only a short period. It is therefore necessary to invest in obtaining genotypes that combine fruit quality with low perishability, or that present fewer physiological disorders after storage. The aims of this work were: i) to evaluate the genetic divergence among 40 peach and nectarine genotypes based on postharvest quality and to select possible parents; ii) to evaluate the susceptibility of peaches and nectarines to chilling injury after cold storage; iii) to evaluate the divergence of peaches and nectarines on the basis of susceptibility to chilling injury and to select superior genotypes; iv) to evaluate the correlations between quality and susceptibility to chilling injury of peaches and nectarines; and v) to select parents combining lower susceptibility to chilling injury with higher fruit quality. The study was carried out at EEAD-CSIC, Zaragoza, Spain, during the 2013/2014 production cycle. A total of 40 peach and nectarine genotypes from the germplasm collection were evaluated. Quality characteristics such as flesh firmness, total soluble solids, titratable acidity, pH, ripening index and flesh color parameters were assessed. The fruits were submitted to cold storage at 0 °C and 5 °C, with 95% average relative humidity. Evaluations were carried out after 14 and 28 days, observing the presence of symptoms such as woolliness (mealiness), grainy flesh, leatheriness and flesh color changes (browning and bleeding), as well as off-flavor. As a selection parameter, the 20% of genotypes with the highest frequency of superiority for quality characteristics, susceptibility to chilling injury, and the combination of both were adopted. For quality characteristics, 'Queen Giant', 'Sudanel Blanco' and 'Borracho de Jarque' presented the greatest divergence. Based on quality, eight genotypes were selected: 'Andross', 'San Jaime', 'San Lorenzo', 'Borracho de Jarque', 'Sudanell 1', 'Carson', 'Baby Gold 6' and 'Stanford'. All genotypes studied were susceptible to one or more symptoms caused by cold storage for 28 days, regardless of temperature. After 14 days, the 'Baby Gold 6', 'Flavortop' and 'Queen Giant' genotypes did not show any physiological disorder caused by cold. In general, the temperature of 0 °C favored postharvest conservation of the fruit, with a lower incidence and severity of symptoms caused by cold storage. Storage for 14 days contributed to a lower incidence of damage in the fruits of the genotypes studied. After 14 days, at both temperatures, divergence was observed for the 'Queen Giant', 'Sudanell Blanco', 'Baby Gold 6', 'GF3', 'Baby Gold 8', 'Campiel' and 'Campiel Rojo' genotypes. After 28 days at 5 °C, the 'Queen Giant', 'Big Top', 'Flavortop' and 'Redhaven' genotypes were divergent. Based on susceptibility to chilling injury at 0 °C, eight genotypes were selected: 'Queen Giant', 'Keimoes', 'Flavortop', 'Big Top', 'Redhaven', 'Sudanell 3', 'Bonet I' and 'Carson'. The quality parameters ripening index, soluble solids, firmness and titratable acidity were correlated with one another. They were also correlated with woolliness and browning, indicating that riper fruits may develop these symptoms more readily. The browning, mealiness, grainy flesh and off-flavor variables were correlated with storage period and temperature, confirming that these symptoms are the main disorders caused by cold storage. The quality characteristics, together with susceptibility to chilling injury, allowed the selection of the 'Baby Gold 6', 'Sarell', 'Keimoes', 'GF3', 'San Jaime', 'Big Top', 'Sudanell 1', 'Carson', 'Baby Gold 8' and 'San Lorenzo' genotypes.

Relevance: 10.00%

Publisher:

Abstract:

To design strategies for the conservation and use of the genetic resources of tree species such as the jaboticaba tree, characterization is essential. In the southwestern Paraná region there are several forest fragments containing native jaboticaba trees (Plinia cauliflora), whose material has broad potential for commercial orchards or breeding programs. Given the potential genetic diversity of a population to produce different genotypes, such a characterization could begin with one of these fragments. The aim was to characterize the fruits of jaboticaba trees (P. cauliflora) in a forest fragment located in Clevelândia, PR, for phenotypic variability, seeking to identify superior individuals for future selection as cultivars or male parents, as well as to estimate the genetic divergence between them as a complementary tool for this purpose. The regeneration and spatial distribution of the species were also assessed. A one-hectare plot (10,000 m²) was defined for the study, with all individuals identified, mapped in a local coordinate system, and measured for height and diameter. Fruits were characterized for sensory and biochemical traits over two years: 70 genotypes in 2013 and 56 in 2014, of which 33 genotypes were evaluated in both years. As a pre-selection criterion, the 20% of genotypes showing the highest frequency of superiority in the evaluated fruit characteristics were chosen. Genetic divergence among the 33 genotypes was analyzed for each year. The distribution pattern and spatial association were evaluated using Ripley's K function. The following ontogenetic stages of the jaboticaba tree were classified for the first time, by plant height: seedling (0.01 to 0.99 m), juvenile (1.0 to 4.99 m), immature (> 5.0 m, non-reproductive) and adult (reproductive). Naturally occurring juxtaposed seedlings, indicating polyembryony, were also described for the first time. The number of regenerating individuals identified in the population (seedlings: n = 2163; juveniles: n = 330; immature: n = 59) was much larger than the number of adults (n = 132). The species showed a reverse-J-shaped size structure, with a high concentration of regenerating individuals. Regeneration is distributed in an aggregated pattern and there is seedling-adult dependence, owing to seed dispersal and seedling emergence close to the mother trees. Jaboticaba regeneration is sufficient to maintain the species over the long term in this population, which should serve as a reference for regeneration success in other studies of this important fruiting species of the Mixed Ombrophilous Forest. Jaboticaba trees 7, 42, 43, 47, 54, 91, 97, 104, 105, 118, 134, 153, 154, 157, 163, 169, 177, 186, 212, J7-01 and J7-02 were pre-selected, with 16 and 194 being the ones that can now be selected for their superior characteristics in both cycles. Hybridization between genotypes 79 and 119, and between 96 and 148, was recommended. The quality of the fruit analyzed showed potential for dual-purpose use, serving both the fresh (in natura) market and processing.