926 results for automatic affect analysis
Abstract:
Plant growth analysis presents difficulties related to the statistical comparison of growth rates, and the analysis of variance of primary data could guide the interpretation of results. The objective of this work was to evaluate the analysis of variance of data from distinct harvests of an experiment, focusing especially on the homogeneity of variances and the choice of an adequate ANOVA model. Data from five experiments covering different crops and growth conditions were used. Of the total number of variables, 19% were originally homoscedastic, 60% became homoscedastic after logarithmic transformation, and 21% remained heteroscedastic after transformation. Data transformation did not affect the F test in one experiment, whereas in the other experiments transformation modified the F test, usually reducing the number of significant effects. Even when transformation did not alter the F test, mean comparisons led to divergent interpretations. The mixed ANOVA model, considering harvest as a random effect, reduced the number of significant effects for every factor whose F test was modified by this model. Examples illustrate that the analysis of variance of primary variables provides a tool for identifying significant differences in growth rates. The analysis of variance imposes restrictions on experimental design, thereby eliminating some advantages of functional growth analysis.
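A minimal sketch of the kind of workflow this abstract describes, not the paper's actual pipeline: test homogeneity of variances across harvests, log-transform, fit a fixed-effects ANOVA, and compare with a mixed model treating harvest as random. All data and variable names are synthetic illustrations.

```python
import numpy as np
import pandas as pd
from scipy.stats import levene
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

rng = np.random.default_rng(0)
# Hypothetical data: dry mass measured at three harvests under two treatments.
df = pd.DataFrame({
    "mass": rng.lognormal(mean=1.0, sigma=0.4, size=60) * np.repeat([1, 2, 4], 20),
    "harvest": np.repeat(["h1", "h2", "h3"], 20),
    "treatment": np.tile(["a", "b"], 30),
})

# Levene's test for homogeneity of variances across harvests.
groups = [g["mass"].values for _, g in df.groupby("harvest")]
print("raw data:", levene(*groups))

# Logarithmic transformation often restores homoscedasticity in growth data.
df["log_mass"] = np.log(df["mass"])
print("log data:", levene(*[np.log(g) for g in groups]))

# Two-way fixed-effects ANOVA on the transformed variable.
model = smf.ols("log_mass ~ treatment * harvest", data=df).fit()
print(anova_lm(model, typ=2))

# Mixed model with harvest as a random effect, as the abstract considers.
mixed = smf.mixedlm("log_mass ~ treatment", df, groups=df["harvest"]).fit()
print(mixed.summary())
```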
Abstract:
Drug safety issues pose serious health threats to the population and constitute a major cause of mortality worldwide. Due to the prominent implications for both public health and the pharmaceutical industry, it is of great importance to unravel the molecular mechanisms by which an adverse drug reaction can be potentially elicited. These mechanisms can be investigated by placing the pharmaco-epidemiologically detected adverse drug reaction in an information-rich context and by exploiting all currently available biomedical knowledge to substantiate it. We present a computational framework for the biological annotation of potential adverse drug reactions. First, the proposed framework investigates previous evidence of the drug-event association in the context of the biomedical literature (signal filtering). Then, it seeks to provide a biological explanation (signal substantiation) by exploring mechanistic connections that might explain why a drug produces a specific adverse reaction. The mechanistic connections include the activity of the drug, related compounds and drug metabolites on protein targets, the association of protein targets with clinical events, and the annotation of proteins (both protein targets and proteins associated with clinical events) to biological pathways. Hence, the workflows for signal filtering and substantiation integrate modules for literature and database mining, in silico drug-target profiling, and analyses based on gene-disease networks and biological pathways. Application examples of these workflows carried out on selected cases of drug safety signals are discussed. The methodology and workflows presented offer a novel approach to exploring the molecular mechanisms underlying adverse drug reactions.
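As a rough illustration of the "signal filtering" idea only, one could score a drug-event pair by how disproportionately often the two are co-mentioned in a literature corpus. The scoring function and all counts below are hypothetical stand-ins; the paper's actual workflow integrates literature and database mining with in silico target profiling.

```python
import math

def literature_association_score(n_both, n_drug, n_event, n_total):
    """Pointwise mutual information of drug and event mentions in a corpus."""
    p_both = n_both / n_total
    p_drug = n_drug / n_total
    p_event = n_event / n_total
    return math.log2(p_both / (p_drug * p_event))

# Hypothetical abstract counts for one drug-event pair.
score = literature_association_score(n_both=40, n_drug=1200,
                                     n_event=900, n_total=100_000)
print(f"PMI = {score:.2f}")  # positive values suggest prior literature evidence
```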
Abstract:
“The liquidity crisis of the Spanish banks is largely due to the lack of confidence of foreign investors and, therefore, the changes that occur in the legislation should not affect the credibility, stability, legal certainty, predictability that markets expect” (Sergio Nasarre, 2011). In the current situation of economic crisis, many people have found they can no longer pay back the mortgage loans that were granted to them in order to purchase a dwelling. It is for this reason that, in light of the economic, political and social problems this poses, our paper studies the state of the Spanish real-estate system and of foreclosure, paying special attention to the solution that has recently been proposed as the best option for debtors who cannot make their mortgage payments: non-recourse mortgaging. We analyze this proposal from legal and economic perspectives in order to fully understand the effects that this change could imply. At the same time, this paper also examines several alternatives we believe would ameliorate the situation of mortgage-holders, among them legal reforms, mortgage insurance, and non-recourse mortgaging itself.
Abstract:
In this work we present a simulation of a recognition process that uses the perimeter characterization of simple plant leaves as the sole discriminating parameter. Data coding that allows for independence from leaf size and orientation may penalize recognition performance for some varieties. Border description sequences are therefore used, and Principal Component Analysis (PCA) is applied in order to study the best number of components for the classification task, implemented by means of a Support Vector Machine (SVM) system. The results obtained are satisfactory and, compared with [4], our system improves recognition success while reducing variance at the same time.
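A minimal sketch of a PCA-plus-SVM pipeline of the kind described, with synthetic stand-ins for the border description sequences, sweeping the number of retained components as in the paper's component-selection study:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.normal(size=(120, 64))    # 120 leaves, 64-point perimeter codes (synthetic)
y = rng.integers(0, 4, size=120)  # 4 hypothetical varieties

for n in (2, 5, 10, 20):
    clf = make_pipeline(StandardScaler(), PCA(n_components=n), SVC(kernel="rbf"))
    scores = cross_val_score(clf, X, y, cv=5)
    print(f"{n:>2} components: accuracy {scores.mean():.2f} +/- {scores.std():.2f}")
```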
Abstract:
In this work we present a simulation of a recognition process that uses the perimeter characterization of simple plant leaves as the sole discriminating parameter. Data coding that allows for independence from leaf size and orientation may penalize recognition performance for some varieties. Border description sequences are therefore used to characterize the leaves. Independent Component Analysis (ICA) is then applied in order to study the best number of components to be considered for the classification task, implemented by means of an Artificial Neural Network (ANN). The results obtained with ICA as a pre-processing tool are satisfactory and, compared with some references, our system improves recognition success up to 80.8%, depending on the number of independent components considered.
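The same leaf-classification sketch adapted to the ICA-plus-ANN variant this abstract describes, with FastICA as the pre-processing step and a small multilayer perceptron as the classifier; data remain synthetic placeholders:

```python
import numpy as np
from sklearn.decomposition import FastICA
from sklearn.model_selection import cross_val_score
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
X = rng.normal(size=(120, 64))    # synthetic border description sequences
y = rng.integers(0, 4, size=120)

for n in (5, 10, 20):
    clf = make_pipeline(
        FastICA(n_components=n, max_iter=500, random_state=0),
        MLPClassifier(hidden_layer_sizes=(32,), max_iter=2000, random_state=0),
    )
    print(f"{n:>2} components: accuracy {cross_val_score(clf, X, y, cv=5).mean():.2f}")
```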
Abstract:
Although paraphrasing is the linguistic mechanism underlying many plagiarism cases, little attention has been paid to its analysis in the framework of automatic plagiarism detection. Therefore, state-of-the-art plagiarism detectors find it difficult to detect cases of paraphrase plagiarism. In this article, we analyse the relationship between paraphrasing and plagiarism, paying special attention to which paraphrase phenomena underlie acts of plagiarism and which of them are detected by plagiarism detection systems. With this aim in mind, we created the P4P corpus, a new resource which uses a paraphrase typology to annotate a subset of the PAN-PC-10 corpus for automatic plagiarism detection. The results of the Second International Competition on Plagiarism Detection were analysed in the light of this annotation. The experiments presented show that (i) more complex paraphrase phenomena and a high density of paraphrase mechanisms make plagiarism detection more difficult, (ii) lexical substitutions are the paraphrase mechanisms used most when plagiarising, and (iii) paraphrase mechanisms tend to shorten the plagiarised text. For the first time, the paraphrase mechanisms behind plagiarism have been analysed, providing critical insights for the improvement of automatic plagiarism detection systems.
Abstract:
BACKGROUND: Findings from randomised trials have shown a higher early risk of stroke after carotid artery stenting than after carotid endarterectomy. We assessed whether white-matter lesions affect the perioperative risk of stroke in patients treated with carotid artery stenting versus carotid endarterectomy. METHODS: Patients with symptomatic carotid artery stenosis included in the International Carotid Stenting Study (ICSS) were randomly allocated to receive carotid artery stenting or carotid endarterectomy. Copies of baseline brain imaging were analysed by two investigators, who were masked to treatment, for the severity of white-matter lesions using the age-related white-matter changes (ARWMC) score. Randomisation was done with a computer-generated sequence (1:1). Patients were divided into two groups using the median ARWMC. We analysed the risk of stroke within 30 days of revascularisation using a per-protocol analysis. ICSS is registered with controlled-trials.com, number ISRCTN 25337470. FINDINGS: 1036 patients (536 randomly allocated to carotid artery stenting, 500 to carotid endarterectomy) had baseline imaging available. Median ARWMC score was 7, and patients were dichotomised into those with a score of 7 or more and those with a score of less than 7. In patients treated with carotid artery stenting, those with an ARWMC score of 7 or more had an increased risk of stroke compared with those with a score of less than 7 (HR for any stroke 2·76, 95% CI 1·17-6·51; p=0·021; HR for non-disabling stroke 3·00, 1·10-8·36; p=0·031), but we did not see a similar association in patients treated with carotid endarterectomy (HR for any stroke 1·18, 0·40-3·55; p=0·76; HR for disabling or fatal stroke 1·41, 0·38-5·26; p=0·607). Carotid artery stenting was associated with a higher risk of stroke compared with carotid endarterectomy in patients with an ARWMC score of 7 or more (HR for any stroke 2·98, 1·29-6·93; p=0·011; HR for non-disabling stroke 6·34, 1·45-27·71; p=0·014), but there was no risk difference in patients with an ARWMC score of less than 7. INTERPRETATION: The presence of white-matter lesions on brain imaging should be taken into account when selecting patients for carotid revascularisation. Carotid artery stenting should be avoided in patients with more extensive white-matter lesions, but might be an acceptable alternative to carotid endarterectomy in patients with less extensive lesions. FUNDING: Medical Research Council, the Stroke Association, Sanofi-Synthélabo, the European Union Research Framework Programme 5.
Abstract:
In the administration, planning, design, and maintenance of road systems, transportation professionals often need to choose between alternatives, justify decisions, evaluate tradeoffs, determine how much to spend, set priorities, assess how well the network meets traveler needs, and communicate the basis for their actions to others. A variety of technical guidelines, tools, and methods have been developed to help with these activities. Such work aids include design criteria guidelines, design exception analysis methods, needs studies, revenue allocation schemes, regional planning guides, designation of minimum standards, sufficiency ratings, management systems, point-based systems to determine eligibility for paving, functional classification, and bridge ratings. While such tools play valuable roles, they also manifest a number of deficiencies and are poorly integrated. Design guides tell what solutions MAY be used, but they aren't oriented towards helping find which one SHOULD be used. Design exception methods help justify deviation from design guide requirements but omit consideration of important factors. Resource distribution is too often based on dividing up what's available rather than helping determine how much should be spent. Point systems serve well as procedural tools but are employed primarily to justify decisions that have already been made. In addition, the tools aren't very scalable: a system-level method of analysis seldom works at the project level and vice versa.

In conjunction with the issues cited above, the operation and financing of the road and highway system is often the subject of criticisms that raise fundamental questions: What is the best way to determine how much money should be spent on a city's or a county's road network? Is the size and quality of the rural road system appropriate? Is too much or too little money spent on road work? What parts of the system should be upgraded and in what sequence? Do truckers receive a hidden subsidy from other motorists? Do transportation professionals evaluate road situations from too narrow a perspective?

In considering these issues and questions, the author concluded that it would be of value to identify and develop a new method that would overcome the shortcomings of existing methods, be scalable, be capable of being understood by the general public, and utilize a broad viewpoint. After trying out a number of concepts, it appeared that a good approach would be to view the road network as a sub-component of a much larger system that also includes vehicles, people, goods-in-transit, and all the ancillary items needed to make the system function. Highway investment decisions could then be made on the basis of how they affect the total cost of operating the total system. A concept, named the "Total Cost of Transportation" method, was then developed and tested. The concept rests on four key principles: 1) roads are but one sub-system of a much larger 'Road Based Transportation System'; 2) the size and activity level of the overall system are determined by market forces; 3) the sum of everything expended, consumed, given up, or permanently reserved in building the system and generating the activity that results from those market forces represents the total cost of transportation; and 4) the economic purpose of making road improvements is to minimize that total cost.

To test the practical value of the theory, a special database and spreadsheet model of Iowa's county road network was developed. This involved creating a physical model to represent the size, characteristics, activity levels, and the rates at which the activities take place, developing a companion economic cost model, and then using the two in tandem to explore a variety of issues. Ultimately, the theory and model proved capable of being used in full-system, partial-system, single-segment, project, and general design guide levels of analysis. The method appeared to be capable of remedying many of the existing work methods' defects and of answering society's transportation questions from a new perspective.
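A minimal sketch of the decision rule the "Total Cost of Transportation" concept implies: rank candidate road improvements by their effect on the cost of the whole road-based system (agency plus user and freight costs), not by road costs alone. All figures and alternative names below are hypothetical.

```python
# Candidate improvements: (annualized agency cost, change in annual
# user + goods-in-transit cost). Figures are illustrative only.
candidates = {
    "do nothing":     (0.0, 0.0),
    "resurface":      (120_000, -90_000),
    "widen and pave": (480_000, -520_000),
}

baseline_system_cost = 2_500_000  # hypothetical total annual system cost

def total_system_cost(agency_cost, user_cost_change):
    """Total cost of operating the whole system under one alternative."""
    return baseline_system_cost + agency_cost + user_cost_change

for name, (a, u) in candidates.items():
    print(f"{name:>14}: total system cost {total_system_cost(a, u):>12,.0f}")

best = min(candidates, key=lambda k: total_system_cost(*candidates[k]))
print("minimum-total-cost choice:", best)
```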
Abstract:
A plant species' genetic population structure is the result of a complex combination of its life history, ecological preferences, position in the ecosystem and historical factors. As a result, many different statistical methods exist that measure different aspects of species' genetic structure. However, little is known about how these methods are interrelated and how they are related to a species' ecology and life history. In this study, we used the IntraBioDiv amplified fragment length polymorphism data set from 27 high-alpine species to calculate eight genetic summary statistics, which we jointly correlated with a set of six ecological and life-history traits. We found a large amount of redundancy among the calculated summary statistics and a significant association with the matrix of species traits. In a multivariate analysis, two main aspects of population structure were visible among the 27 species: the first is related to the species' dispersal capacities, and the second is most likely related to the species' postglacial recolonization of the Alps. Furthermore, we found that some summary statistics, most importantly Mantel's r and Jost's D, behave differently than expected from theory. We therefore advise caution against drawing overly strong conclusions from these statistics.
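Since the abstract singles out Mantel's r, here is a minimal permutation Mantel test, correlating two distance matrices under row/column permutations of one of them. The matrices below are synthetic; real use would supply genetic and geographic distance matrices.

```python
import numpy as np

def mantel_r(a, b):
    """Pearson correlation of the upper triangles of two distance matrices."""
    ia, ib = a[np.triu_indices_from(a, k=1)], b[np.triu_indices_from(b, k=1)]
    return np.corrcoef(ia, ib)[0, 1]

def mantel_test(a, b, n_perm=999, seed=0):
    rng = np.random.default_rng(seed)
    r_obs = mantel_r(a, b)
    # Count permutations whose correlation is at least as large as observed.
    count = sum(
        mantel_r(a, b[np.ix_(p, p)]) >= r_obs
        for p in (rng.permutation(len(b)) for _ in range(n_perm))
    )
    return r_obs, (count + 1) / (n_perm + 1)

# Synthetic symmetric distance matrices for illustration.
rng = np.random.default_rng(0)
pts = rng.normal(size=(15, 2))
d1 = np.linalg.norm(pts[:, None] - pts[None, :], axis=-1)
d2 = d1 + rng.normal(scale=0.3, size=d1.shape)
d2 = (d2 + d2.T) / 2
print(mantel_test(d1, d2))  # (observed r, permutation p-value)
```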
Abstract:
Using Swiss data from the 2003 International Social Survey Programme (N = 902), this multilevel study combined individual and municipality levels of analysis in the explanation of nationalism, patriotism and exclusionary immigration attitudes. On the individual level, the results show that, in line with previous research, nationalism (uncritical and blind attachment to the nation) increased exclusionary immigration attitudes, while patriotism (pride in national democratic institutions) was related to greater tolerance towards immigration. On the municipality level, urbanization, socioeconomic status and immigrant proportion (and their interaction effects) were found to affect nationalism, patriotism and immigration attitudes. Nationalist and patriotic forms of national attachment were stronger in German-speaking municipalities than in French-speaking ones. Path analyses further revealed that living in a Swiss-German municipality indirectly led to more negative immigration attitudes through an increase in nationalism. The research is discussed in light of the social psychological and political science literature on political attitudes.
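A minimal sketch of the multilevel setup the abstract describes: individuals nested in municipalities, with exclusionary attitudes regressed on nationalism and patriotism via a random-intercept model. Data and variable names are hypothetical stand-ins for the survey items.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n, n_muni = 902, 60
df = pd.DataFrame({
    "municipality": rng.integers(0, n_muni, size=n),
    "nationalism": rng.normal(size=n),
    "patriotism": rng.normal(size=n),
})
# Simulate a municipality-level (level-2) random effect.
muni_effect = rng.normal(scale=0.5, size=n_muni)
df["exclusion"] = (0.6 * df["nationalism"] - 0.3 * df["patriotism"]
                   + muni_effect[df["municipality"]] + rng.normal(size=n))

# Random-intercept model with municipality as the grouping unit.
fit = smf.mixedlm("exclusion ~ nationalism + patriotism", df,
                  groups=df["municipality"]).fit()
print(fit.summary())
```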
Abstract:
This paper analyses the effect of R&D investment on firm growth. We use an extensive sample of Spanish manufacturing and service firms. The database comprises several waves of the Spanish Community Innovation Survey and covers the period 2004–2008. First, a probit model corrected for sample selection analyses the role of innovation in the probability of being a high-growth firm (HGF). Second, a quantile regression technique is applied to explore the determinants of firm growth. Our database shows that a small number of firms experience fast growth rates in terms of sales or employees. Our results reveal that R&D investments positively affect the probability of becoming an HGF. However, differences appear between manufacturing and service firms. Finally, when we study the impact of R&D investment on firm growth, quantile estimations show that internal R&D has a significant positive impact in the upper quantiles, while external R&D shows a significant positive impact up to the median. Keywords: High-growth firms, Firm growth, Innovation activity. JEL Classifications: L11, L25, L26, O30
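A minimal sketch of the quantile-regression step: estimating the effect of internal and external R&D on firm growth at several quantiles. The data are synthetic and the variable names are illustrative stand-ins for the survey fields.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 1000
df = pd.DataFrame({"internal_rd": rng.exponential(size=n),
                   "external_rd": rng.exponential(size=n)})
# Simulate internal R&D mattering more in the upper tail of growth.
df["growth"] = (0.1 * df["internal_rd"] * rng.uniform(0, 2, size=n)
                + 0.05 * df["external_rd"] + rng.normal(scale=0.5, size=n))

for q in (0.25, 0.50, 0.75, 0.90):
    fit = smf.quantreg("growth ~ internal_rd + external_rd", df).fit(q=q)
    print(f"q={q:.2f}: internal {fit.params['internal_rd']:+.3f}, "
          f"external {fit.params['external_rd']:+.3f}")
```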
Abstract:
The most adequate approach for benchmarking web accessibility is manual expert evaluation supplemented by automatic analysis tools. But manual evaluation has a high cost and is impractical to apply to large websites. In reality, there is no choice but to rely on automated tools when reviewing large web sites for accessibility. The question is: to what extent can the results from automatic evaluation of a web site and of individual web pages be used as an approximation of manual results? This paper presents the initial results of an investigation aimed at answering this question. We have performed both manual and automatic evaluations of the accessibility of the web pages of two sites and we have compared the results. In our data set, automatically retrieved results could most definitely be used as an approximation of manual evaluation results.
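A minimal sketch of the comparison the abstract reports: correlating per-page accessibility scores from an automatic tool with manual expert scores to gauge how well the former approximates the latter. The scores below are hypothetical.

```python
from scipy.stats import pearsonr, spearmanr

# Hypothetical per-page accessibility scores from the two evaluation methods.
manual_scores    = [0.82, 0.54, 0.91, 0.40, 0.73, 0.66, 0.35, 0.88]
automatic_scores = [0.78, 0.60, 0.86, 0.45, 0.70, 0.59, 0.42, 0.90]

print("Pearson:", pearsonr(manual_scores, automatic_scores))
print("Spearman:", spearmanr(manual_scores, automatic_scores))
```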
Abstract:
The paper deals with the development and application of a methodology for the automatic mapping of pollution/contamination data. The General Regression Neural Network (GRNN) is considered in detail and is proposed as an efficient tool to solve this problem. The automatic tuning of isotropic and anisotropic GRNN models using a cross-validation procedure is presented. Results are compared with a k-nearest-neighbours interpolation algorithm using an independent validation data set. The quality of mapping is controlled by the analysis of the raw data and of the residuals using variography. Maps of the probability of exceeding a given decision level and 'thick' isoline visualization of the uncertainties are presented as examples of decision-oriented mapping. A real case study is based on the mapping of radioactively contaminated territories.
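A minimal isotropic GRNN sketch (Nadaraya-Watson kernel regression) with the cross-validated bandwidth tuning the abstract mentions; coordinates and measurements are synthetic stand-ins for contamination data.

```python
import numpy as np

def grnn_predict(x_train, y_train, x_query, sigma):
    # Gaussian kernel weights between every query and training point.
    d2 = ((x_query[:, None, :] - x_train[None, :, :]) ** 2).sum(-1)
    w = np.exp(-d2 / (2 * sigma**2))
    return (w @ y_train) / w.sum(axis=1)

def loo_error(x, y, sigma):
    # Leave-one-out cross-validation: zero each point's own weight.
    d2 = ((x[:, None, :] - x[None, :, :]) ** 2).sum(-1)
    w = np.exp(-d2 / (2 * sigma**2))
    np.fill_diagonal(w, 0.0)
    pred = (w @ y) / w.sum(axis=1)
    return ((pred - y) ** 2).mean()

rng = np.random.default_rng(0)
x = rng.uniform(0, 10, size=(200, 2))             # sampling locations
y = np.sin(x[:, 0]) + 0.1 * rng.normal(size=200)  # measured values

# Automatic tuning: pick the kernel width minimizing the LOO error.
sigmas = np.linspace(0.1, 2.0, 20)
best = min(sigmas, key=lambda s: loo_error(x, y, s))
print("best isotropic sigma:", best)
print("prediction at (5, 5):", grnn_predict(x, y, np.array([[5.0, 5.0]]), best))
```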
Abstract:
Granular flow phenomena are frequently encountered in the design of process and industrial plants in the traditional fields of the chemical, nuclear and oil industries, as well as in other activities such as food and materials handling. Multi-phase flow is one important branch of granular flow. Granular materials behave unusually compared to normal materials, either solids or fluids. Although some of their characteristics are still not well known, one thing is confirmed: particle-particle interaction plays a key role in the dynamics of granular materials, especially dense granular materials. At the beginning of this thesis, the development of two models for describing this interaction is presented in detail, based on the results of finite-element simulation, dimensional analysis and numerical simulation. The first model describes the normal collision of viscoelastic particles. Building on existing models, more parameters are added, which makes the model predict the experimental results more accurately. The second model is used for oblique collisions and includes the effects of tangential velocity, angular velocity and surface friction based on Coulomb's law. The theoretical predictions of this model agree with those of finite-element simulation.

In the latter chapters of this thesis, the models are used to predict industrial granular flows, and the agreement between the simulations and experiments further validates the new models. The first case presents the simulation of granular flow passing over a circular obstacle. The simulations successfully predict the existence of a parabolic steady layer and show how the characteristics of the particles, such as the coefficients of restitution and surface friction, affect the separation results. The second case is a spinning container filled with granular material. Employing the previous models, the simulation could also reproduce experimentally observed phenomena, such as a depression in the center at high rotation frequencies. The third application concerns gas-solid mixed flow in a vertically vibrated device. Gas-phase motion is added and coupled with the particle motion. The governing equations of the gas phase are solved using large eddy simulation (LES), and particle motion is predicted using the Lagrangian method. The simulation predicted some of the pattern formation reported by experiment.
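To make the normal-collision modelling concrete, here is a minimal sketch of a normal collision between two identical spheres using a simple linear spring-dashpot contact force, integrated explicitly. The thesis's viscoelastic models are more elaborate; the parameters below are illustrative only.

```python
# Parameters: linear stiffness, damping, particle mass, radius (SI units).
k, c, m, r = 1e4, 2.0, 1e-3, 5e-3
dt = 1e-6

x1, v1 = 0.0, 1.0            # left particle position and velocity
x2, v2 = 2 * r - 1e-9, -1.0  # right particle, with a tiny initial overlap

while True:
    overlap = 2 * r - (x2 - x1)
    if overlap <= 0:             # particles have separated
        break
    rel_v = v1 - v2              # normal approach velocity
    f = k * overlap + c * rel_v  # spring + dashpot contact force
    v1 -= f / m * dt             # contact force decelerates particle 1 ...
    v2 += f / m * dt             # ... and pushes particle 2 away
    x1 += v1 * dt
    x2 += v2 * dt

# Effective coefficient of restitution (initial relative speed was 2 m/s).
print("restitution ~", abs(v1 - v2) / 2.0)
```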
Abstract:
Centrifugal compressors are widely used, for example, in the process industry, the oil and gas industry, and in small gas turbines and turbochargers. In order to achieve lower energy consumption and operating costs, the efficiency of the compressor needs to be improved. In the present work, different pinches and low solidity vaned diffusers were utilized in order to improve the efficiency of a medium-size centrifugal compressor. In this study, pinch means a decrease in the diffuser flow passage height. First, different geometries were analyzed using computational fluid dynamics. The flow solver Finflo, a Navier-Stokes solver, was used to solve the flow field; it is capable of solving compressible, incompressible, steady and unsteady flow fields. Chien's k-ε turbulence model was used. One of the numerically investigated pinched diffusers and one low solidity vaned diffuser were studied experimentally. The overall performance of the compressor and the static pressure distribution before and after the diffuser were measured. The flow entering and leaving the diffuser was measured using a three-hole Cobra probe and Kiel probes. The pinch and the low solidity vaned diffuser increased the efficiency of the compressor. The highest isentropic efficiency increment obtained was 3% of the design isentropic efficiency of the original geometry. The numerical results showed that a pinch made to both the hub and the shroud wall was most beneficial to the operation of the compressor, and that a pinch made to the hub was better than a pinch made to the shroud. The pinch did not affect the operating range of the compressor, but the low solidity vaned diffuser slightly decreased it.

The unsteady phenomena in the vaneless diffuser were studied experimentally and numerically. The unsteady static pressure was measured at the diffuser inlet and outlet, and a time-accurate numerical simulation was conducted. The unsteady static pressure showed that most of the pressure variations lay at the passing frequency of every second blade. The pressure variations did not vanish in the diffuser and were still visible at the diffuser outlet; however, their amplitude decreased in the diffuser. The time-accurate calculations showed quite good agreement with the measured data. Agreement was very good at the design operating point, even though the computational grid was not dense enough in the volute and in the exit cone. The time-accurate calculation over-predicted the amplitude of the pressure variations at high flow.
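For reference, a minimal sketch of the standard total-to-total isentropic efficiency used to judge such diffuser modifications, computed from inlet/outlet total temperatures and the total pressure ratio. The numerical values are hypothetical; gamma is taken for air.

```python
gamma = 1.4  # ratio of specific heats for air

def isentropic_efficiency(T01, T02, pressure_ratio):
    """Compressor isentropic efficiency from total temperatures and p02/p01."""
    # Ideal (isentropic) outlet total temperature for the same pressure ratio.
    T02s = T01 * pressure_ratio ** ((gamma - 1) / gamma)
    return (T02s - T01) / (T02 - T01)

# Hypothetical measured states: 293 K inlet, 390 K outlet, pressure ratio 2.2.
eta = isentropic_efficiency(T01=293.0, T02=390.0, pressure_ratio=2.2)
print(f"isentropic efficiency ~ {eta:.3f}")
```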