915 results for New methodology
Abstract:
A new method, based on linear correlation and phase diagrams, was developed for processes such as sedimentation, where the deposition phase can have different durations - represented by repeated values in a series - and where erosion can play an important role by deleting values from the series. The sampling process itself can also produce repeated values (large strata sampled twice) or deleted values (thin strata falling between two consecutive samples). We developed a mathematical procedure which, based on the evolution of chemical composition with depth, allows the boundaries and the periodicity of different sedimentary environments to be established. The basic tool is simply a linear correlation analysis that detects possible evolution rules connected with cyclical phenomena within time series (with depth treated as time), with prediction as the final objective. A particularly interesting finding was the phenomenon of repeated sliding windows, which represent quasi-cycles of a series of quasi-periods. An accurate forecast can be obtained inside a quasi-cycle: the remaining elements of the cycle can be predicted with a probability related to the number of repeated and deleted points. Because this is an innovative methodology, its efficiency is being tested in several case studies, with remarkable results that demonstrate its efficacy. Keywords: sedimentary environments, sequence stratigraphy, data analysis, time-series, conditional probability.
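As a purely illustrative reading of the repeated-sliding-window idea (not the authors' actual procedure), the sketch below slides a fixed-length window along a depth series and flags window pairs whose Pearson correlation exceeds a threshold as candidate quasi-cycles; the window length, threshold, and synthetic signal are assumptions.

```python
import numpy as np

def find_quasi_cycles(series, window=10, threshold=0.95):
    """Flag pairs of sliding windows whose Pearson correlation is high,
    suggesting a repeated pattern (a candidate quasi-cycle)."""
    n = len(series) - window + 1
    windows = np.array([series[i:i + window] for i in range(n)])
    matches = []
    for i in range(n):
        for j in range(i + window, n):           # skip overlapping windows
            r = np.corrcoef(windows[i], windows[j])[0, 1]
            if r >= threshold:
                matches.append((i, j, r))         # j - i approximates the quasi-period
    return matches

# Example with a noisy periodic "composition versus depth" profile
depth_signal = np.sin(np.linspace(0, 8 * np.pi, 200)) + 0.1 * np.random.randn(200)
print(find_quasi_cycles(depth_signal)[:5])
```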
Abstract:
Dissertation submitted to Faculdade de Ciências e Tecnologia - Universidade Nova de Lisboa in fulfilment of the requirements for the degree of Doctor of Philosophy (Biochemistry - Biotechnology)
Abstract:
Environmental Training in Engineering Education (ENTREE 2001) - integrated green policies: progress for progress, p. 329-339 (Florence, 14-17 November 2001; proceedings published as book)
Abstract:
This paper presents an electricity medium voltage (MV) customer characterization framework supported by knowledge discovery in databases (KDD). The main idea is to identify typical load profiles (TLP) of MV consumers and to develop a rule set for the automatic classification of new consumers. To achieve our goal a methodology is proposed consisting of several steps: data pre-processing; application of several clustering algorithms to segment the daily load profiles; selection of the best partition, corresponding to the best consumers' segmentation, based on the assessment of several clustering validity indices; and finally, a classification model is built based on the resulting clusters. To validate the proposed framework, a case study which includes a real database of MV consumers is performed.
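A rough illustration of this pipeline (pre-processing, trying several partitions, selecting one with a validity index, then training a classifier on the resulting clusters) is sketched below with scikit-learn; the synthetic profiles, the silhouette index as the validity criterion, and the decision-tree classifier are assumptions rather than the paper's exact choices.

```python
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans
from sklearn.metrics import silhouette_score
from sklearn.tree import DecisionTreeClassifier

# Toy stand-in for daily load profiles (one row per consumer, 24 hourly values)
profiles = np.random.rand(500, 24)

# 1) Pre-processing: scale the daily profiles
X = StandardScaler().fit_transform(profiles)

# 2-3) Try several partitions and keep the one with the best validity index (silhouette here)
best_k, best_score, best_labels = None, -1.0, None
for k in range(3, 9):
    labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(X)
    score = silhouette_score(X, labels)
    if score > best_score:
        best_k, best_score, best_labels = k, score, labels

# 4) Build a classification model that assigns new consumers to the typical load profiles
classifier = DecisionTreeClassifier(max_depth=5).fit(X, best_labels)
print(f"best k = {best_k}, silhouette = {best_score:.3f}")
```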
Abstract:
An intensive use of dispersed energy resources is expected in future power systems, including distributed generation, especially based on renewable sources, and electric vehicles. System operation methods and tools must be adapted to this increased complexity, especially the optimal resource scheduling problem. Therefore, the use of metaheuristics is required to obtain good solutions in a reasonable amount of time. This paper proposes two new heuristics, called naive electric vehicles charge and discharge allocation and generation tournament based on cost, developed to obtain an initial solution for the energy resource scheduling methodology based on simulated annealing previously developed by the authors. The case study considers two scenarios with 1000 and 2000 electric vehicles connected to a distribution network. The proposed heuristics are compared with a deterministic approach and present a very small error in the objective function, with a low execution time for the scenario with 2000 vehicles.
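The role of a heuristic initial solution can be illustrated with a generic sketch, not the authors' formulation: a greedy allocation of electric-vehicle charging to the cheapest periods serves as the starting point for a plain simulated annealing loop. The price vector, cost function, neighbourhood move, and cooling schedule below are all assumptions.

```python
import math
import random

periods = 24
prices = [random.uniform(0.05, 0.20) for _ in range(periods)]   # assumed energy price per period

def naive_ev_allocation(required_energy, max_per_period):
    """Greedy initial solution: charge the required energy in the cheapest periods first."""
    schedule, remaining = [0.0] * periods, required_energy
    for t in sorted(range(periods), key=lambda t: prices[t]):
        charge = min(max_per_period, remaining)
        schedule[t] = charge
        remaining -= charge
        if remaining <= 0:
            break
    return schedule

def cost(schedule):
    return sum(p * e for p, e in zip(prices, schedule))

def simulated_annealing(schedule, max_per_period, iters=5000, t0=1.0, alpha=0.999):
    current, temp = schedule[:], t0
    for _ in range(iters):
        # Neighbourhood move: shift a small amount of energy between two random periods
        a, b = random.sample(range(periods), 2)
        delta = min(0.5, current[a], max_per_period - current[b])
        candidate = current[:]
        candidate[a] -= delta
        candidate[b] += delta
        worse_by = cost(candidate) - cost(current)
        if worse_by < 0 or random.random() < math.exp(-worse_by / temp):
            current = candidate
        temp *= alpha
    return current

initial = naive_ev_allocation(required_energy=10.0, max_per_period=3.0)
optimised = simulated_annealing(initial, max_per_period=3.0)
print(f"initial cost = {cost(initial):.3f}, optimised cost = {cost(optimised):.3f}")
```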
Abstract:
Dissertation submitted to obtain the degree of Master in Mechanical Engineering
Abstract:
Dissertation submitted to obtain the degree of Master in Electrical and Computer Engineering
Abstract:
The GxE interaction only became widely discussed after evolutionary studies and evaluations of the causes of behavioural changes of species cultivated in different environments. Over the last 60 years, several methodologies for studying the adaptability and stability of genotypes in multi-environment trials have been developed to assist the breeder's choice of which genotypes are more stable and which are most suitable for crops in the most diverse environments. Methods based on linear regression analysis were the first to be widely used by breeders, followed by multivariate analysis methods and mixed models. The need to identify the genetic and environmental causes behind the GxE interaction led to the development of new models that include covariates and that can also combine multivariate methods and mixed modelling. However, further studies are needed to identify the causes of the GxE interaction and to measure more accurately its effects on the phenotypic expression of varieties in competition trials carried out in genetic breeding programs.
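Regression-based adaptability and stability methods typically regress each genotype's mean performance on an environmental index (Eberhart-Russell style); a slope near 1 and small deviations from regression indicate stability. The sketch below is an illustration with invented yield data, not a method or dataset taken from this abstract.

```python
import numpy as np

# Toy yield data: rows = genotypes, columns = environments
yields = np.array([
    [4.1, 5.0, 6.2, 4.8],
    [3.9, 5.5, 6.8, 5.1],
    [4.5, 4.9, 5.4, 4.7],
])

# Environmental index: mean of all genotypes in each environment, centred on the grand mean
env_index = yields.mean(axis=0) - yields.mean()

for g, y in enumerate(yields):
    # Slope b close to 1 -> average adaptability; small residual variance -> stability
    b, a = np.polyfit(env_index, y, 1)
    residuals = y - (a + b * env_index)
    print(f"genotype {g}: b = {b:.2f}, deviation variance = {residuals.var(ddof=2):.3f}")
```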
Abstract:
PURPOSE: To measure the thickness of adductor pollicis muscle in healthy adults. This measurement will be used as a nutritional anthropometric parameter in further studies. SUBJECTS AND METHOD: Four hundred and twenty-one healthy adults were studied, 209 men and 212 women, with ages ranging from 18 to 87 years, living in Rio de Janeiro. The adductor pollicis muscle was also studied in the human anatomy lab as well as in normal healthy volunteers using CAT scans and nuclear magnetic resonance imaging to ensure that only the adductor pollicis was included in measurement of muscle thickness with a Lange caliper. To standardize the measurement, the methodology was detailed, with subjects sitting with the dominant hand dangling over the homolateral thigh and the elbow bent at approximately a 90° angle. The Lange caliper was applied at a pressure of 10 g/mm², pinching the adductor pollicis muscle at the vertex of an imaginary angle between the thumb and the index finger. The average of 3 consecutive measurements was considered to be the muscle thickness. RESULTS: This study provides the first estimates of adductor pollicis thickness in normal healthy subjects as an anthropometric parameter. The normal values in the dominant hand for men were 12.5 ± 2.8 mm (mean ± SD), median 12 mm, and for women were 10.5 ± 2.3 mm, median 10 mm.
Abstract:
Multiarm star polymers are attractive materials due to their unusual bulk and solution properties. They are considered analogues of dendrimers with a wide range of applications, such as drug delivery, membranes, coatings and lithography.1 The advent of controlled polymerization made possible the existence of this unique class of organic nanoparticles (ONPs).2 Two major synthetic strategies are usually employed in the preparation of star polymers, the core-first and arm-first approaches. The core-first approach involves a controlled living polymerization using a multiarm initiator core, while the arm-first methodology is based on the quenching of living polymers with a multifunctional coupling agent or bifunctional vinyl compounds. Herein, we present the synthesis and characterization of a new star polymer, the multiarm star poly(2-hydroxyethyl methacrylate). The tetra-armed star polymer was prepared by reversible addition-fragmentation chain-transfer (RAFT) polymerization using the core-first approach. The RAFT chain-transfer agent (RAFT CTA) pentaerythritol tetrakis[2-(dodecylthiocarbonothioylthio)-2-methylpropionate] was used as the multiarm initiator core, and 2-hydroxyethyl methacrylate (HEMA) was polymerized using AIBN as radical initiator. Structural characterization was performed by 1H NMR and FTIR. The new polymer is able to take up large quantities of organic solvents, forming gels. The rheological behavior of these gels was also investigated.
Abstract:
Purpose – The purpose of this paper is to develop a subjective multidimensional measure of early career success during university-to-work transition. Design/methodology/approach – The construct of university-to-work success (UWS) was defined in terms of intrinsic and extrinsic career outcomes, and a three-stage study was conducted to create a new scale. Findings – A preliminary set of items was developed and tested by judges. Results showed the items had good content validity. Factor analyses indicated a four-factor structure and a second-order model with subscales to assess: career insertion and satisfaction, confidence in career future, income and financial independence, and adaptation to work. Third, the authors sought to confirm the hypothesized model examining the comparative fit of the scale and two alternative models. Results showed that fits for both the first- and second-order models were acceptable. Research limitations/implications – The proposed model has sound psychometric qualities, although the validated version of the scale was not able to incorporate all constructs envisaged by the initial theoretical model. Results indicated some direction for further refinement. Practical implications – The scale could be used as a tool for self-assessment or as an outcome measure to assess the efficacy of university-to-work programs in applied settings. Originality/value – This study provides a useful single measure to assess early career success during the university-to-work transition, and might facilitate testing of causal models which could help identify factors relevant for successful transition.
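The exploratory step described above (factor analysis suggesting a four-factor structure) can be sketched with scikit-learn's FactorAnalysis on invented item responses; the data, the number of items, and the fixed choice of four factors are assumptions, and confirmatory comparison of first- and second-order models would require a dedicated SEM package.

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis

# Toy stand-in for respondents x questionnaire items (Likert-type answers 1-5)
responses = np.random.randint(1, 6, size=(300, 20)).astype(float)

fa = FactorAnalysis(n_components=4, random_state=0).fit(responses)
loadings = fa.components_.T           # items x factors
print(np.round(loadings[:5], 2))      # inspect loadings for the first five items
```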
Abstract:
Does shareholder value orientation lead to shareholder value creation? This article proposes methods to quantify both shareholder value orientation and shareholder value creation. Through the application of these models it is possible to quantify both dimensions and to examine statistically to what extent shareholder value orientation explains shareholder value creation. The scoring model developed in this paper quantifies the orientation of managers towards the objective of maximizing shareholder wealth. The method evaluates information disclosed by the companies and scores value orientation on a scale from 0 to 10 points. Analytically, the variable value orientation is operationalized as the general attitude of managers towards the objective of value creation, investment policy and behavior, flexibility, and eight further value drivers. The value creation model works with market data such as stock prices and dividend payments. Both methods were applied to a sample of 38 blue chip companies: 32 firms belonged to the share index IBEX 35 on July 1st, 1999, one company represents the “new economy” listed in the Spanish New Market as per July 1st, 2001, and 5 European multinational groups formed part of the EuroStoxx 50 index also on July 1st, 2001. The research period comprised the financial years 1998, 1999, and 2000. A regression analysis showed that between 15.9% and 23.4% of shareholder value creation can be explained by shareholder value orientation.
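The final regression step can be illustrated as a simple OLS of a value-creation measure on the 0-10 orientation score, with R-squared playing the role of the explained share reported above; the generated data and coefficients below are purely illustrative, not the study's figures.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
orientation = rng.uniform(0, 10, size=38).reshape(-1, 1)            # 0-10 orientation scores, 38 firms
creation = 0.4 * orientation.ravel() + rng.normal(0, 2.5, size=38)  # assumed value-creation measure

model = LinearRegression().fit(orientation, creation)
r_squared = model.score(orientation, creation)
print(f"R^2 = {r_squared:.3f}")  # share of value creation explained by value orientation
```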
Abstract:
In this paper, 27 studies from the last decade which deal more or less explicitly with the International New Venture, global start-up or born-global phenomenon are first identified, and then fully examined and critically assessed as a basis for obtaining an adequate view of the state of the art of this increasingly important research avenue in the field of International Entrepreneurship (IE). The methodology used for this synthetic review allows us to analyze a number of recent, purposefully chosen studies that are systematically compared along the following criteria: 1) main objective and type of research; 2) theoretical framework(s) of reference; 3) methodological issues; and 4) main findings and/or conclusions. As a result of this literature review, a critical assessment follows in which the most relevant benefits and contributions, as well as potential drawbacks, limitations or major discrepancies in the research activities conducted so far, are discussed. Finally, some suggestions and implications are provided in the form of future research directions.
Abstract:
This paper examines the extent to which Mexican emigrants to the United States are negatively selected, that is, have lower skills than individuals who remain in Mexico. Previous studies have been limited by the lack of nationally representative longitudinal data. This one uses a newly available household survey, which identifies emigrants before they leave and allows a direct comparison with non-migrants. I find that, on average, US-bound Mexican emigrants from 2000 to 2004 earn a lower wage and have fewer years of schooling than individuals who remain in Mexico, evidence of negative selection. This supports the original hypothesis of Borjas (AER, 1987) and argues against recent findings, notably those of Chiquiar and Hanson (JPE, 2005). The discrepancy with the latter is primarily due to an undercount of unskilled migrants in US sources and secondarily to the omission of unobservables in their methodology.
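The negative-selection check amounts to comparing the pre-migration wages and schooling of future emigrants with those of non-migrants; a minimal sketch on simulated data is shown below, where the sample size, migration rate, and wage equation are assumptions rather than the survey's values.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1000
migrant = rng.binomial(1, 0.1, size=n)                    # 1 = household member later emigrates to the US
schooling = rng.normal(9, 3, size=n) - 1.5 * migrant      # assumed: emigrants have fewer years of schooling
log_wage = 0.8 + 0.08 * schooling + rng.normal(0, 0.3, size=n)

for name, var in [("schooling", schooling), ("log wage", log_wage)]:
    gap = var[migrant == 1].mean() - var[migrant == 0].mean()
    print(f"{name}: emigrant minus non-migrant gap = {gap:.3f}")   # negative gaps -> negative selection
```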
Abstract:
The level of information provided by ink evidence to the criminal and civil justice system is limited. The limitations arise from the weakness of the interpretative framework currently used, as proposed in ASTM 1422-05 and 1789-04 on ink analysis. It is proposed to use the likelihood ratio from Bayes' theorem to interpret ink evidence. Unfortunately, when considering the analytical practices defined in the ASTM standards on ink analysis, it appears that current ink analytical practices do not allow for the level of reproducibility and accuracy required by a probabilistic framework. Such a framework relies on the evaluation of the statistics of ink characteristics using an ink reference database and on the objective measurement of similarities between ink samples. A complete research programme was designed to (a) develop a standard methodology for analysing ink samples in a more reproducible way, (b) compare ink samples automatically and objectively, and (c) evaluate the proposed methodology in a forensic context. This report focuses on the first of the three stages. A calibration process, based on a standard dye ladder, is proposed to improve the reproducibility of ink analysis by HPTLC when inks are analysed at different times and/or by different examiners. The impact of this process on the variability between repeated analyses of ink samples under various conditions is studied. The results show significant improvements in the reproducibility of ink analysis compared to traditional calibration methods.
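The calibration idea, mapping retention values measured on one plate onto a common reference scale via the standard dye ladder, can be sketched as a simple interpolation step so that runs performed at different times or by different examiners become comparable; the dye-ladder positions and ink band values below are illustrative, not the study's data.

```python
import numpy as np

# Assumed retention (Rf) values of the standard dye ladder on a reference plate...
reference_ladder = np.array([0.10, 0.25, 0.45, 0.70, 0.90])
# ...and on the plate actually used for the questioned ink (run at another time / by another examiner)
measured_ladder = np.array([0.12, 0.28, 0.47, 0.73, 0.91])

def calibrate(rf_values):
    """Map Rf values measured on the current plate onto the reference scale
    by interpolating against the dye ladder, making separate runs comparable."""
    return np.interp(rf_values, measured_ladder, reference_ladder)

ink_bands = np.array([0.30, 0.52, 0.75])
print(np.round(calibrate(ink_bands), 3))
```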