963 results for Applicant criterion
Abstract:
We study the functional specialization whereby some countries contribute relatively more inventors than organizations to the production of inventions on a global scale. We propose a conceptual framework to explain this type of functional specialization, which posits feedback between two distinct sub-systems, one providing inventors and the other organizations. We quantify the phenomenon by means of a new metric, the “inventor balance”, which we compute using patent data. We show that the observed imbalances, which are often conspicuous, are determined by several factors: the innovativeness of a country relative to its level of economic development, relative factor endowments, the degree of technological specialization and, lastly, cultural traits. We argue that the “inventor balance” is a useful indicator for policy makers, and that its routine analysis could lead to better-informed innovation policies.
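The “inventor balance” is defined in the abstract only verbally. As a rough sketch of how such a metric could be computed from patent data, the snippet below contrasts a country's share of inventor entries with its share of applicant entries; the record layout and the share-difference formula are illustrative assumptions, not the paper's definition.

```python
from collections import Counter

# Toy patent records; each lists the countries of its inventors and of its
# applicant organizations. The balance is computed here as a country's share
# of all inventor entries minus its share of all applicant entries: positive
# values mean the country supplies relatively more inventors than
# organizations (an assumed reading, not the paper's formula).
patents = [
    {"inventor_countries": ["CH", "CH"], "applicant_countries": ["US"]},
    {"inventor_countries": ["IN"],       "applicant_countries": ["US"]},
    {"inventor_countries": ["US", "IN"], "applicant_countries": ["US"]},
]

inventors = Counter(c for p in patents for c in p["inventor_countries"])
applicants = Counter(c for p in patents for c in p["applicant_countries"])
n_inv, n_app = sum(inventors.values()), sum(applicants.values())

for country in sorted(set(inventors) | set(applicants)):
    balance = inventors[country] / n_inv - applicants[country] / n_app
    print(f"{country}: inventor balance = {balance:+.2f}")
```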
Abstract:
Shot peening is a cold-working mechanical process in which a shot stream is propelled against a component surface. Its purpose is to introduce compressive residual stresses on component surfaces to increase fatigue resistance. The process is widely applied to springs because of their cyclic loading requirements. This paper presents a numerical model of the shot peening process using the finite element method. The results are compared with experimental measurements of the residual stresses, obtained by the X-ray diffraction technique, on leaf springs subjected to this process. Furthermore, the results are compared with empirical and numerical correlations developed by other authors.
Abstract:
The paper discusses the effect of stress triaxiality on the onset and evolution of damage in ductile metals. A series of tests, including shear tests and experiments on smooth and pre-notched tension specimens, was carried out for a wide range of stress triaxialities. The underlying continuum damage model is based on kinematic definitions of damage tensors. The modular structure of the approach is accomplished by the decomposition of strain rates into elastic, plastic and damage parts. Free energy functions with respect to fictitious undamaged configurations as well as damaged ones are introduced separately, leading to elastic material laws which are affected by increasing damage. In addition, a macroscopic yield condition and a flow rule are used to adequately describe the plastic behavior. Numerical simulations of the experiments are performed and good correlation between tests and numerical results is achieved. Based on experimental and numerical data, the damage criterion formulated in stress space is quantified. Different branches of this function are taken into account, corresponding to different damage modes depending on stress triaxiality and Lode parameter. In addition, the identification of material parameters is discussed in detail. (C) 2007 Elsevier Ltd. All rights reserved.
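For reference, the two stress-state measures the damage criterion depends on have commonly used textbook definitions (not reproduced from the paper itself): the stress triaxiality is the ratio of mean to equivalent stress, and the Lode parameter orders the principal stresses.

```latex
% Stress triaxiality \eta in terms of the mean stress \sigma_m and the
% von Mises equivalent stress \sigma_{eq} (s is the deviatoric stress):
\[
  \eta = \frac{\sigma_m}{\sigma_{eq}}, \qquad
  \sigma_m = \tfrac{1}{3}\,\mathrm{tr}\,\boldsymbol{\sigma}, \qquad
  \sigma_{eq} = \sqrt{\tfrac{3}{2}\,\mathbf{s}:\mathbf{s}} .
\]
% Lode parameter L in terms of the ordered principal stresses:
\[
  L = \frac{2\sigma_2 - \sigma_1 - \sigma_3}{\sigma_1 - \sigma_3},
  \qquad \sigma_1 \ge \sigma_2 \ge \sigma_3 .
\]
```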
Abstract:
Establishing a few sites at which measurements of soil water storage (SWS) are time-stable significantly reduces the effort involved in determining average values of SWS. This study aimed to apply a new criterion, the mean absolute bias error (MABE), to identify temporally stable sites for mean SWS evaluation. The performance of MABE was compared with that of the commonly used criterion, the standard deviation of relative difference (SDRD). From October 2004 to October 2008, the SWS of four soil layers (0-1.0, 1.0-2.0, 2.0-3.0, and 3.0-4.0 m) was measured, using a neutron probe, at 28 sites on a hillslope of the Loess Plateau, China. A total of 37 SWS data sets taken over time were divided into two subsets, the first consisting of 22 dates collected during the calibration period from October 2004 to September 2006, and the second of 15 dates collected during the validation period from October 2006 to October 2008. The results showed that if a critical value of 5% for MABE was adopted, more than half the sites were temporally stable for both periods, and the number of temporally stable sites generally increased with soil depth. Compared with SDRD, MABE was more suitable for identifying time-stable sites for mean SWS prediction. Because the absolute prediction error at drier sites is more sensitive to changes in relative difference, sites in wetter sectors are preferable for mean SWS prediction given the same changes in relative difference.
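A minimal sketch of the two site-selection criteria under the standard time-stability setup: each site's relative difference from the field-average SWS is tracked over time, SDRD is its temporal standard deviation, and MABE is taken here as the time-averaged absolute relative difference (an assumption; the paper's exact definition may differ). The 5% cutoff is the critical value quoted in the abstract; the data are synthetic.

```python
import numpy as np

# sws[i, j]: soil water storage at site i on measurement date j (toy data
# with site-to-site offsets plus temporal noise).
rng = np.random.default_rng(0)
n_sites, n_dates = 28, 22
sws = (400 + 40 * rng.standard_normal((n_sites, 1))
           + 15 * rng.standard_normal((n_sites, n_dates)))

spatial_mean = sws.mean(axis=0)                 # field-average SWS per date
rel_diff = (sws - spatial_mean) / spatial_mean  # relative difference delta_ij

sdrd = rel_diff.std(axis=1)              # standard deviation of relative difference
mabe = np.abs(rel_diff).mean(axis=1)     # assumed MABE: mean absolute relative difference

stable = np.where(mabe < 0.05)[0]        # 5% critical value from the abstract
print(f"{stable.size} of {n_sites} sites temporally stable by MABE < 5%:", stable)
```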
Abstract:
Market-based transmission expansion planning gives investors information on where investment is most cost-efficient and brings benefits to those who invest in the grid. However, both market issues and power system adequacy problems are system planners' concerns. In this paper, a hybrid probabilistic criterion, the Expected Economical Loss (EEL), is proposed as an index to evaluate a system's overall expected economic losses during operation in a competitive market. It reflects both the investors' and the planner's points of view and further improves on the traditional reliability cost. By applying EEL, system planners can obtain a clear idea of the transmission network's bottleneck and the amount of loss arising from this weak point. In turn, this enables planners to assess the worth of providing reliable services. The EEL also contains valuable information for investors to guide their investment decisions. This index can truly reflect the random behaviors of power systems and the uncertainties of the electricity market. The performance of the EEL index is enhanced by applying a Normalized Coefficient of Probability (NCP), so it can be utilized in large real power systems. A numerical example is carried out on the IEEE Reliability Test System (RTS), showing how the EEL can predict the current system bottleneck under future operational conditions and how to use EEL as one of the planning objectives to determine future optimal plans. Monte Carlo simulation is employed to capture the probabilistic characteristics of the electricity market, and Genetic Algorithms (GAs) are used as a multi-objective optimization tool.
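As a rough illustration of how a probabilistic loss index of this kind can be estimated, the sketch below Monte Carlo samples random load levels and outages for a single transmission corridor and averages the monetary value of curtailed load. The state model, probabilities and prices are invented for illustration and are not the paper's system model.

```python
import random

# Toy Monte Carlo estimate of an EEL-style index: sample random system
# states (corridor outage + load level), price the unserved demand in each
# state, and average over samples. All numbers are illustrative assumptions.
random.seed(42)

CAPACITY_MW = 100.0      # transfer capability of the studied corridor
OUTAGE_PROB = 0.02       # per-sample probability the corridor is lost
PRICE_PER_MWH = 80.0     # assumed value of lost load, $/MWh

def sample_loss():
    load = random.gauss(mu=90.0, sigma=15.0)           # random demand, MW
    capacity = 0.0 if random.random() < OUTAGE_PROB else CAPACITY_MW
    curtailed = max(0.0, load - capacity)              # unserved demand, MW
    return curtailed * PRICE_PER_MWH                   # loss for one hour, $

n = 100_000
eel = sum(sample_loss() for _ in range(n)) / n
print(f"Estimated EEL: ${eel:,.0f} per hour")
```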
Abstract:
Objective: To evaluate whether including children with onset of symptoms between ages 7 and 12 years in the ADHD diagnostic category would: (a) increase the prevalence of the disorder at age 12, and (b) change the clinical and cognitive features, impairment profile, and risk factors for ADHD compared with findings in the literature based on the DSM-IV definition of the disorder. Method: A birth cohort of 2,232 British children was prospectively evaluated at ages 7 and 12 years for ADHD using information from mothers and teachers. The prevalence of diagnosed ADHD at age 12 was evaluated with and without the inclusion of individuals who met the DSM-IV age-of-onset criterion through mothers' or teachers' reports of symptoms at age 7. Children with onset of ADHD symptoms before versus after age 7 were compared on their clinical and cognitive features, impairment profile, and risk factors for ADHD. Results: Extending the age-of-onset criterion to age 12 resulted in a negligible (0.1%) increase in ADHD prevalence by age 12 years. Children who first manifested ADHD symptoms between ages 7 and 12 did not present correlates or risk factors that were significantly different from those of children who manifested symptoms before age 7. Conclusions: Results from this prospective birth cohort suggest that adults who are able to report symptom onset by age 12 also had symptoms by age 7, even if they are not able to report them. The data suggest that the prevalence estimate, correlates, and risk factors of ADHD will not be affected if the new diagnostic scheme extends the age-of-onset criterion to age 12. J. Am. Acad. Child Adolesc. Psychiatry, 2010;49(3):210-216.
Abstract:
Frame rate upconversion (FRUC) is an important post-processing technique to enhance the visual quality of low frame rate video. A major recent advance in this area is FRUC based on trilateral filtering, whose novelty mainly derives from the combination of an edge-based motion estimation block matching criterion with the trilateral filter. However, there is still room for improvement, notably in reducing the size of the uncovered regions in the initial estimated frame, i.e. the estimated frame before trilateral filtering. In this context, proposed is an improved motion estimation block matching criterion in which a combined luminance and edge error metric is weighted according to the motion vector components, notably to regularise the motion field. Experimental results confirm that significant improvements are achieved for the final interpolated frames, reaching average PSNR gains of up to 2.73 dB over recent alternative solutions, for video content with varied motion characteristics.
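A sketch of what a combined, motion-regularised block-matching cost in this spirit can look like: a luminance SAD plus an edge-map SAD, with a penalty that grows with the motion-vector magnitude so that large, erratic vectors are discouraged. The weights (lam, mu) and the penalty form are illustrative assumptions, not the letter's exact metric.

```python
import numpy as np

def matching_cost(block_cur, block_ref, edges_cur, edges_ref, mv,
                  lam=0.5, mu=2.0):
    sad_luma = np.abs(block_cur - block_ref).sum()   # luminance error
    sad_edge = np.abs(edges_cur - edges_ref).sum()   # edge-consistency error
    mv_penalty = mu * np.hypot(*mv)                  # motion-field regulariser
    return sad_luma + lam * sad_edge + mv_penalty

# Usage with random 8x8 blocks and a candidate motion vector (3, -2):
rng = np.random.default_rng(1)
b_cur, b_ref = rng.random((8, 8)), rng.random((8, 8))
e_cur, e_ref = rng.random((8, 8)), rng.random((8, 8))
print(matching_cost(b_cur, b_ref, e_cur, e_ref, mv=(3, -2)))
```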
Abstract:
Research on cluster analysis for categorical data continues to develop, with new clustering algorithms being proposed. However, in this context, the determination of the number of clusters is rarely addressed. We propose a new approach in which clustering and the estimation of the number of clusters are done simultaneously for categorical data. We assume that the data originate from a finite mixture of multinomial distributions and use a minimum message length (MML) criterion to select the number of clusters (Wallace and Bolton, 1986). For this purpose, we implement an EM-type algorithm (Silvestre et al., 2008) based on the approach of Figueiredo and Jain (2002). The novelty of the approach rests on integrating model estimation and selection of the number of clusters in a single algorithm, rather than selecting this number from a set of pre-estimated candidate models. The performance of our approach is compared with the Bayesian Information Criterion (BIC) (Schwarz, 1978) and the Integrated Completed Likelihood (ICL) (Biernacki et al., 2000) using synthetic data. The results illustrate the capacity of the proposed algorithm to attain the true number of clusters while outperforming BIC and ICL in speed, which is especially relevant when dealing with large data sets.
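To make the comparison concrete, here is a minimal sketch of the baseline strategy the paper argues against: fit one finite-mixture model per candidate K by EM, then select K afterwards with an information criterion such as BIC. For brevity the sketch uses a mixture of independent Bernoullis on synthetic binary data rather than the paper's multinomial mixture.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate(n=600):
    # Synthetic binary data from 3 latent classes with distinct profiles.
    z = rng.integers(0, 3, n)
    theta = np.array([[.9, .9, .1, .1],
                      [.1, .9, .9, .1],
                      [.1, .1, .9, .9]])
    return (rng.random((n, 4)) < theta[z]).astype(float)

def loglik(X, pi, theta):
    logp = X @ np.log(theta).T + (1 - X) @ np.log1p(-theta).T + np.log(pi)
    m = logp.max(axis=1, keepdims=True)
    return float((m.ravel() + np.log(np.exp(logp - m).sum(axis=1))).sum())

def em(X, K, iters=300):
    n, d = X.shape
    pi, theta = np.full(K, 1 / K), rng.uniform(.3, .7, (K, d))
    for _ in range(iters):
        logp = X @ np.log(theta).T + (1 - X) @ np.log1p(-theta).T + np.log(pi)
        logp -= logp.max(axis=1, keepdims=True)
        r = np.exp(logp); r /= r.sum(axis=1, keepdims=True)   # E-step
        nk = r.sum(axis=0) + 1e-9
        pi = nk / nk.sum()                                    # M-step
        theta = np.clip((r.T @ X) / nk[:, None], 1e-6, 1 - 1e-6)
    return pi, theta

X = simulate()
for K in range(1, 6):
    pi, theta = em(X, K)
    p = (K - 1) + K * X.shape[1]          # free parameters of the model
    bic = -2 * loglik(X, pi, theta) + p * np.log(len(X))
    print(f"K={K}: BIC={bic:.1f}")
```

The paper's contribution is precisely to avoid this outer loop over K: an MML-based EM prunes superfluous components within a single run.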
Abstract:
We assess the performance of Gaussianity tests, namely the Anscombe-Glynn, Lilliefors, Cramér-von Mises, and Giannakis-Tsatsanis (G-T) tests, for the purpose of detecting narrowband and wideband interference in GNSS signals. Simulations show that the G-T test outperforms the others, making it suitable as a benchmark for comparison with different types of interference detection algorithms. © 2014 EURASIP.
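A simplified illustration of the detection principle: clean GNSS baseband samples are well modelled as Gaussian noise, while interference adds structure that a Gaussianity test can flag. SciPy's D'Agostino-Pearson normality test is used here as a stand-in for the G-T statistic (which is likewise built on higher-order statistics); all signal parameters are arbitrary demo choices.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n = 4096
t = np.arange(n)

clean = rng.standard_normal(n)                        # noise-only samples
jammed = clean + 2.0 * np.sin(2 * np.pi * 0.01 * t)   # narrowband interferer

for name, x in [("clean", clean), ("jammed", jammed)]:
    stat, pval = stats.normaltest(x)                  # skew + kurtosis test
    verdict = "interference detected" if pval < 0.01 else "Gaussian"
    print(f"{name}: p = {pval:.3g} -> {verdict}")
```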
Abstract:
Based on a literature review, this article frames the different stages of the foster care process, identifying a set of standardized measures in the American and Portuguese contexts which, if implemented, could contribute to higher levels of fostering success. The article continues with the presentation of a comparative study based on the application of the Casey Foster Applicant Inventory-Applicant Version (CFAI-A) questionnaire in the aforementioned contexts. Starting from a comparative analysis of the CFAI-A's psychometric characteristics in four different samples, we found that, despite being adapted to the Portuguese reality, the questionnaire retained the quality values reported for the American samples, specifically showing significant values for reliability and validity. This questionnaire, which aims to assess the potential of foster families, supports the technical staff's decision-making process regarding the monitoring and support of foster families, while also promoting better placement decisions for the child's integration and development.
Abstract:
Risk management is of paramount importance to the success of tunnelling works and is linked to the tunnelling method and to the constraints of the works. The Sequential Excavation Method (SEM) and the Tunnel Boring Machine (TBM) method have been competing for years. This article, part of a wider study on the influence of the “Safety and Health” criterion in the choice of method, reviews the existing literature on the criteria usually employed to choose the tunnelling method and on the “Safety and Health” criterion. This criterion is particularly important due to the financial impacts of work accidents and occupational diseases. The article is especially useful to the scientific and technical community, since it synthesizes the relevance of each of the choice criteria used and shows why “Safety and Health” must be a criterion in the decision-making process to choose the tunnelling method.
Abstract:
We prove a criterion for the irreducibility of an integral group representation ρ over the fraction field of a Noetherian domain R in terms of suitably defined reductions of ρ at prime ideals of R. As applications, we give irreducibility results for universal deformations of residual representations, with special attention to universal deformations of residual Galois representations associated with modular forms of weight at least 2.
Abstract:
This study presents a classification criterion for two-class Cannabis seedlings. As the cultivation of drug-type cannabis is forbidden in Switzerland, law enforcement authorities regularly ask laboratories to determine a cannabis plant's chemotype from seized material in order to ascertain whether a plantation is legal. In this study, the classification analysis is based on data obtained from the relative proportions of three major leaf compounds measured by gas chromatography interfaced with mass spectrometry (GC-MS). The aim is to discriminate between drug-type (illegal) and fiber-type (legal) cannabis at an early stage of growth. A Bayesian procedure is proposed: a Bayes factor is computed and classification is performed on the basis of the decision maker's specifications (i.e. prior probability distributions on cannabis type and consequences of classification measured by losses). Classification rates are computed with two statistical models and the results are compared. A sensitivity analysis is then performed to assess the robustness of the classification criteria.
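A minimal sketch of the decision step the abstract describes: compute a Bayes factor from the two chemotype likelihoods, then classify by combining it with prior probabilities and misclassification losses. The Gaussian likelihoods, priors and loss values below are illustrative assumptions, not the study's fitted models.

```python
from scipy import stats

# Assumed likelihood models for a THC-related leaf-compound proportion.
drug  = stats.norm(loc=0.75, scale=0.08)   # P(feature | drug type)
fibre = stats.norm(loc=0.30, scale=0.08)   # P(feature | fibre type)

prior_drug = 0.5
loss_miss  = 10.0   # loss of calling a drug-type plant "fibre"
loss_false = 1.0    # loss of calling a fibre-type plant "drug"

def classify(x):
    bf = drug.pdf(x) / fibre.pdf(x)                     # Bayes factor
    posterior_odds = bf * prior_drug / (1 - prior_drug)
    # Decide "drug" when its expected loss is the smaller one:
    # loss_false * P(fibre|x) < loss_miss * P(drug|x)
    return "drug" if posterior_odds > loss_false / loss_miss else "fibre"

for x in (0.25, 0.5, 0.72):
    print(f"feature={x:.2f}: classified as {classify(x)}")
```

With a miss (drug called fibre) assumed ten times costlier than a false alarm, the rule classifies ambiguous seedlings as drug type, which is how the loss specification shifts the decision boundary.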
Abstract:
In their safety evaluations of bisphenol A (BPA), the U.S. Food and Drug Administration (FDA) and a counterpart in Europe, the European Food Safety Authority (EFSA), have given special prominence to two industry-funded studies that adhered to standards defined by Good Laboratory Practices (GLP). These same agencies have given much less weight in risk assessments to a large number of independently replicated non-GLP studies conducted with government funding by the leading experts in various fields of science from around the world. OBJECTIVES: We reviewed differences between industry-funded GLP studies of BPA conducted by commercial laboratories for regulatory purposes and non-GLP studies conducted in academic and government laboratories to identify hazards and molecular mechanisms mediating adverse effects. We examined the methods and results in the GLP studies that were pivotal in the draft decision of the U.S. FDA declaring BPA safe in relation to findings from studies that were competitive for U.S. National Institutes of Health (NIH) funding, peer-reviewed for publication in leading journals, subject to independent replication, but rejected by the U.S. FDA for regulatory purposes. DISCUSSION: Although the U.S. FDA and EFSA have deemed two industry-funded GLP studies of BPA to be superior to hundreds of studies funded by the U.S. NIH and NIH counterparts in other countries, the GLP studies on which the agencies based their decisions have serious conceptual and methodologic flaws. In addition, the U.S. FDA and EFSA have mistakenly assumed that GLP yields valid and reliable scientific findings (i.e., "good science"). Their rationale for favoring GLP studies over hundreds of publicly funded studies ignores the central factors in determining the reliability and validity of scientific findings, namely, independent replication and the use of the most appropriate and sensitive state-of-the-art assays, neither of which is an expectation of industry-funded GLP research. CONCLUSIONS: Public health decisions should be based on studies using appropriate protocols with appropriate controls and the most sensitive assays, not GLP. Relevant NIH-funded research using state-of-the-art techniques should play a prominent role in safety evaluations of chemicals.