950 results for Mathematical and statistical techniques


Relevance: 100.00%

Publisher:

Abstract:

This Licentiate Thesis presents and discusses new contributions in applied mathematics directed towards scientific computing in sports engineering. It considers inverse problems of biomechanical simulations with rigid-body musculoskeletal systems, especially in cross-country skiing. This contrasts with the bulk of research on cross-country skiing biomechanics, which is based mainly on experimental testing alone. The thesis consists of an introduction and five papers. The introduction motivates the context of the papers and puts them into a more general framework. Two papers (D and E) model and simulate real questions in cross-country skiing. The results give some interesting indications concerning these challenging questions, which can serve as a basis for further research; however, the measurements are not accurate enough to give final answers. Paper C is a simulation study, more extensive than papers D and E, that is compared with electromyography measurements from the literature. Validation in biomechanical simulations is difficult, and reducing mathematical errors is one way of coming closer to realistic results. Paper A examines well-posedness for forward dynamics with full muscle dynamics, and paper B is a technical report describing the problem formulation, mathematical models and simulations of paper A in more detail. The new modelling, together with the simulations, enables new possibilities; as with simulations in other engineering fields, these must be handled with care in order to achieve reliable results. The results in this thesis indicate that mathematical modelling and numerical simulation can be very useful for describing cross-country skiing biomechanics. Hence, the thesis contributes to the possibility of beginning to use and develop such modelling and simulation techniques in this context as well.

Relevance: 100.00%

Publisher:

Abstract:

Adaptability and invisibility are hallmarks of modern terrorism, and keeping pace with its dynamic nature presents a serious challenge for societies throughout the world. Innovations in computer science have incorporated applied mathematics to develop a wide array of predictive models supporting the variety of approaches to counterterrorism. Predictive models are usually designed to forecast the location of attacks. Although this may protect individual structures or locations, it does not reduce the threat: it merely changes the target. While predictive models dedicated to events or social relationships receive much attention where the mathematical and social science communities intersect, models dedicated to terrorist locations such as safe-houses (rather than their targets or training sites) are rare and possibly nonexistent. At the time of this research, there were no publicly available models designed to predict locations where violent extremists are likely to reside. This research uses France as a case study to present a complex systems model that incorporates multiple quantitative, qualitative and geospatial variables differing in scale, weight, and type. Though many of these variables are recognized by specialists in security studies, controversy remains over their relative importance, degree of interaction, and interdependence. Additionally, some of the variables proposed in this research are not generally recognized as drivers, yet they warrant examination based on their potential role within a complex system. This research tested multiple regression models and determined that geographically weighted regression analysis produced the most accurate result by accommodating non-stationary coefficient behavior, demonstrating that geographic variables are critical to understanding and predicting the phenomenon of terrorism.
This dissertation presents a flexible prototypical model that can be refined and applied to other regions to inform stakeholders such as policy-makers and law enforcement in their efforts to improve national security and enhance quality-of-life.
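Geographically weighted regression, the technique the abstract singles out, fits a separate weighted least-squares model at each location, with observation weights decaying with distance. A minimal numpy sketch on synthetic data (all variables and values here are hypothetical, not the study's) shows how it recovers a coefficient that drifts across space, which a single global regression would average away:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200
coords = rng.uniform(0, 10, (n, 2))      # (x, y) locations of observations
X = rng.normal(size=n)                    # one standardised predictor
beta_true = 1.0 + 0.3 * coords[:, 0]      # non-stationary coefficient: grows eastward
y = beta_true * X + rng.normal(0, 0.1, n)

def gwr_coef(at, coords, X, y, bandwidth=2.0):
    """Local slope at one point via Gaussian-kernel weighted least squares."""
    d2 = np.sum((coords - at) ** 2, axis=1)
    w = np.exp(-d2 / (2 * bandwidth ** 2))        # weights decay with distance
    A = np.column_stack([np.ones_like(X), X])     # intercept + predictor
    beta = np.linalg.solve(A.T @ (w[:, None] * A), A.T @ (w * y))
    return beta[1]

west = gwr_coef(np.array([1.0, 5.0]), coords, X, y)
east = gwr_coef(np.array([9.0, 5.0]), coords, X, y)
# The locally fitted slope is larger in the east, tracking beta_true.
```

The bandwidth controls how local each fit is; choosing it (e.g. by cross-validation) is the main tuning decision in practice.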

Relevance: 100.00%

Publisher:

Abstract:

Long-term monitoring of acoustic environments is gaining popularity thanks to the relevant scientific and engineering insights that it provides. The increasing interest is due to the constant growth of storage capacity and computational power for processing large amounts of data. In this perspective, machine learning (ML) provides a broad family of data-driven statistical techniques for dealing with large databases. Nowadays, the conventional praxis of sound level meter measurements limits the global description of a sound scene to an energetic point of view: the equivalent continuous level (Leq) remains the main metric used to define an acoustic environment. Finer analyses involve the use of statistical levels, but acoustic percentiles rest on temporal assumptions that are not always reliable. A statistical approach based on the occurrences of sound pressure levels brings a different perspective to the analysis of long-term monitoring. Depicting a sound scene through the most probable sound pressure level, rather than through portions of energy, yields more specific information about the activity carried out during the measurements; the statistical mode of the occurrences can capture typical behaviors of specific kinds of sound sources. The present work proposes an ML-based method to identify, separate and measure coexisting sound sources in real-world scenarios. It is based on long-term monitoring and is addressed to acousticians focused on the analysis of environmental noise in a variety of contexts. The method is based on clustering analysis: two algorithms, the Gaussian Mixture Model and K-means clustering, form the core of a process for investigating different active spaces monitored through sound level meters. The procedure has been applied in two different contexts: university lecture halls and offices.
The proposed method yields robust and reliable descriptions of the acoustic scenario and could become an important analytical tool for acousticians.
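The abstract names K-means and Gaussian Mixture Models as the core of the clustering step. As a minimal illustration (not the authors' implementation), a plain 1-D K-means on synthetic level occurrences separates two coexisting sources, a steady background and intermittent speech, by their modal levels:

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic short-Leq values (dB): a ~38 dB ventilation background mixed
# with ~62 dB speech activity within one long-term measurement (invented data).
levels = np.concatenate([rng.normal(38, 2, 600), rng.normal(62, 3, 400)])

def kmeans_1d(x, k=2, iters=50):
    """Plain 1-D k-means; returns cluster centres sorted ascending, plus labels."""
    centres = np.quantile(x, np.linspace(0.1, 0.9, k))   # spread initial centres
    for _ in range(iters):
        labels = np.argmin(np.abs(x[:, None] - centres[None, :]), axis=1)
        centres = np.sort([x[labels == j].mean() for j in range(k)])
    return centres, labels

centres, labels = kmeans_1d(levels)
# centres[0] ~ background level, centres[1] ~ speech level
```

A Gaussian Mixture Model would additionally estimate each source's spread and mixing weight, which matters when the level distributions overlap.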

Relevance: 100.00%

Publisher:

Abstract:

The importance of medicinal plants and their use in industrial applications is increasing worldwide, especially in Brazil. Phyllanthus species, popularly known as quebra-pedras in Brazil, are used in folk medicine for treating urinary infections and renal calculus. This paper reports an authenticity study of herbal drugs from Phyllanthus species, comparing commercial and authentic samples by means of spectroscopic techniques (FT-IR, ¹H HR-MAS NMR and ¹H NMR in solution) combined with chemometric analysis. The spectroscopic techniques evaluated, coupled with chemometric methods, show great potential for the investigation of complex matrices. Furthermore, several metabolites were identified by the NMR techniques.

Relevance: 100.00%

Publisher:

Abstract:

Currently, the acoustic and nanoindentation techniques are two of the most widely used methods for measuring the elastic modulus of materials. This article presents and discusses the fundamental principles and limitations of both techniques, and reviews recent advances in nanoindentation. An experimental study of ceramic, metallic, composite and single-crystal materials was also performed. The results show that the ultrasonic technique provides values in agreement with those reported in the literature. However, the ultrasonic technique cannot measure the elastic modulus of some small samples and single crystals. The nanoindentation technique, on the other hand, estimates elastic modulus values in reasonable agreement with those measured by acoustic methods, particularly in amorphous materials, while in some polycrystalline materials some deviation from the expected values was observed.
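For the ultrasonic route, the standard isotropic-elasticity relations recover the moduli from the measured wave speeds and the density. The sketch below uses textbook values typical of polycrystalline aluminium purely as an illustration (not data from the article):

```python
# Isotropic solid: longitudinal (vl) and shear (vt) wave speeds plus density
# determine the elastic constants. Illustrative values for polycrystalline Al.
rho = 2700.0              # density, kg/m^3
vl, vt = 6420.0, 3040.0   # wave speeds, m/s

G = rho * vt**2                                    # shear modulus, Pa
nu = (vl**2 - 2 * vt**2) / (2 * (vl**2 - vt**2))   # Poisson's ratio
E = 2 * G * (1 + nu)                               # Young's modulus, Pa
# E comes out near the handbook value for aluminium (~69 GPa).
```

The limitation noted in the abstract follows directly from these relations: both wave speeds must be measurable along a well-defined path, which small samples and anisotropic single crystals may not allow.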

Relevance: 100.00%

Publisher:

Abstract:

In this study, 73 South American red wines (Vitis vinifera) from five varietals were classified based on sensory quality, retail price and antioxidant activity, and characterised in relation to their phenolic composition. ORAC and DPPH assays were used to determine the antioxidant activity, and sensory analysis was conducted by seven professional tasters using the Wine & Spirit Education Trust's structured scales. The use of multivariate statistical techniques allowed the identification of wines with the best combination of sensory characteristics, price and antioxidant activity. The most favourable varieties were Malbec, Cabernet Sauvignon and Syrah produced in Chile and Argentina. Conversely, Pinot Noir wines displayed the lowest sensory ratings and antioxidant activity. These results suggest that volatile compounds may be the main substances responsible for differentiating red wines on the basis of sensory evaluation.
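The abstract does not name the specific multivariate technique; principal component analysis is a common first step in such wine-classification studies. A minimal numpy sketch (synthetic data, hypothetical variables) projects 73 wines onto the first two components:

```python
import numpy as np

rng = np.random.default_rng(2)
# Hypothetical standardised feature matrix: 73 wines x 5 variables
# (e.g. sensory score, price, ORAC, DPPH, total phenolics) - invented data.
X = rng.normal(size=(73, 5))
X[:30] += 1.5                   # pretend one group of varietals scores higher

Xc = X - X.mean(axis=0)                  # centre each variable
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt[:2].T                   # wine coordinates on the first two PCs
explained = s**2 / (s**2).sum()          # fraction of variance per component
```

Plotting `scores` coloured by varietal is how such studies typically visualise which wines combine favourable sensory, price and antioxidant profiles.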

Relevance: 100.00%

Publisher:

Abstract:

The antioxidant activity of natural and synthetic compounds was evaluated using five in vitro methods: ferric reducing/antioxidant power (FRAP), 2,2-diphenyl-1-picrylhydrazyl (DPPH), oxygen radical absorbance capacity (ORAC), oxidation of an aqueous dispersion of linoleic acid accelerated by azo-initiators (LAOX), and oxidation of a meat homogenate submitted to thermal treatment (TBARS). All results were expressed as Trolox equivalents. The application of multivariate statistical techniques suggested that the phenolic compounds (caffeic acid, carnosic acid, genistein and resveratrol), besides their high antioxidant activity measured by the DPPH, FRAP and TBARS methods, showed the highest ability to react with the radicals in the ORAC methodology, compared with the other compounds evaluated in this study (ascorbic acid, erythorbate, tocopherol, BHT, Trolox, tryptophan, citric acid, EDTA, glutathione, lecithin, methionine and tyrosine). This property was significantly correlated with the number of phenolic rings and the catecholic structure present in the molecule. Based on the multivariate analysis, it is possible to select compounds from different clusters and explore their antioxidant activity interactions in food products.

Relevance: 100.00%

Publisher:

Abstract:

Discussions opposing the Theory of the Firm to the Theory of Stakeholders are contemporary and polemical. One focal point of such debates concerns which objective function companies should choose, that of the shareholders or that of the stakeholders, and whether it is possible to pursue both simultaneously. Several empirical studies have attempted to test a possible correlation between the two functions, with no consensus so far. The objective of the present research is to examine a gap in these discussions: is there (or not) a subordination of the stakeholders' objective function to that of the shareholders? The research is empirical and analytical and employs quantitative methods. Hypotheses were tested and data analysed using non-parametric (chi-square test) and parametric procedures (frequency, correlation coefficient). Secondary data were collected from the Economática database and from the Brazilian Institute of Social and Economic Analyses (IBASE) website for public companies that published their Social Balance Statements following the IBASE model from 1999 to 2006, yielding a sample of 65 companies. To assess the shareholders' objective function, a proxy was created based on three indices: ROE (return on equity), Enterprise Value and Tobin's Q. To assess the stakeholders' objective function, a proxy was created using the following IBASE social balance indices: internal (ISI), external (ISE) and environmental (IAM). The results show no evidence of subordination of the stakeholders' objective function to that of the shareholders in the companies analysed, negating initial expectations and calling for deeper investigation.
Its main conclusion, that the hypothesised subordination does not take place, is limited to the sample investigated here and calls for ongoing research aimed at enlarging the sample, which would make feasible the application of other statistical techniques and a more thorough analysis of the studied phenomenon.
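The non-parametric procedure mentioned, the chi-square test of independence, can be sketched with numpy on a hypothetical 2x2 table; the counts below are invented for illustration and are not the study's data:

```python
import numpy as np

# Hypothetical contingency table: firms above/below the median shareholder
# proxy (rows) vs above/below the median stakeholder proxy (columns).
table = np.array([[20, 13],
                  [12, 20]])

def chi_square(obs):
    """Pearson chi-square statistic for a contingency table."""
    expected = np.outer(obs.sum(axis=1), obs.sum(axis=0)) / obs.sum()
    return ((obs - expected) ** 2 / expected).sum()

stat = chi_square(table)
# df = (2-1)*(2-1) = 1; the 5% critical value is 3.841
reject_independence = stat > 3.841
```

If the statistic stays below the critical value, independence of the two proxies cannot be rejected at the 5% level, which is the logical shape of a no-subordination finding.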

Relevance: 100.00%

Publisher:

Abstract:

Real-time three-dimensional echocardiography (RT3DE) has been demonstrated to be an accurate technique for quantifying left ventricular (LV) volumes and function in different patient populations. We sought to determine the value of RT3DE for evaluating patients with hypertrophic cardiomyopathy (HCM), in comparison with cardiac magnetic resonance imaging (MRI). Methods: We studied 20 consecutive patients with HCM who underwent two-dimensional echocardiography (2DE), RT3DE, and MRI. Parameters analyzed by echocardiography and MRI included wall thickness, LV volumes, ejection fraction (LVEF), mass, geometric index, and dyssynchrony index. Statistical analysis was performed using the Lin agreement coefficient, Pearson linear correlation, and the Bland-Altman model. Results: There was excellent agreement between 2DE and RT3DE (Rc = 0.92), 2DE and MRI (Rc = 0.85), and RT3DE and MRI (Rc = 0.90) for linear measurements. Agreement indexes for LV end-diastolic and end-systolic volumes were Rc = 0.91 and Rc = 0.91 between 2DE and RT3DE, Rc = 0.94 and Rc = 0.95 between RT3DE and MRI, and Rc = 0.89 and Rc = 0.88 between 2DE and MRI, respectively. Satisfactory agreement was observed between 2DE and RT3DE (Rc = 0.75), RT3DE and MRI (Rc = 0.83), and 2DE and MRI (Rc = 0.73) for determining LVEF, with a mild underestimation of LVEF by 2DE and smaller variability between RT3DE and MRI. Regarding LV mass, excellent agreement was observed between RT3DE and MRI (Rc = 0.96), with a bias of -6.3 g (limits of agreement, -54.73 to 42.22 g). Conclusion: In patients with HCM, RT3DE demonstrated performance superior to 2DE for the evaluation of myocardial hypertrophy, LV volumes, LVEF, and LV mass.
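The two agreement statistics this abstract relies on, Lin's concordance coefficient (Rc) and the Bland-Altman bias with limits of agreement, are short numpy computations. The data below are synthetic stand-ins (hypothetical LV mass values), not the patients' measurements:

```python
import numpy as np

def lin_ccc(a, b):
    """Lin's concordance correlation coefficient between two methods."""
    ma, mb = a.mean(), b.mean()
    va, vb = a.var(), b.var()                # population (ddof=0) variances
    cov = ((a - ma) * (b - mb)).mean()
    return 2 * cov / (va + vb + (ma - mb) ** 2)

def bland_altman(a, b):
    """Mean difference (bias) and 95% limits of agreement."""
    diff = a - b
    bias = diff.mean()
    half_width = 1.96 * diff.std()           # ddof=0 for simplicity
    return bias, bias - half_width, bias + half_width

rng = np.random.default_rng(3)
mri = rng.normal(150, 30, 20)                # hypothetical LV mass by MRI (g)
rt3de = mri + rng.normal(-6, 10, 20)         # echo values with a small bias
rc = lin_ccc(rt3de, mri)
bias, lo, hi = bland_altman(rt3de, mri)
```

Unlike Pearson correlation, Rc penalises both a systematic offset and a scale difference between methods, which is why agreement studies report it alongside the Bland-Altman limits.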

Relevance: 100.00%

Publisher:

Abstract:

Objective: To compare the accuracy and feasibility of harmonic power Doppler and digitally subtracted colour coded grey scale imaging for the assessment of perfusion defect severity by single photon emission computed tomography (SPECT) in an unselected group of patients. Design: Cohort study. Setting: Regional cardiothoracic unit. Patients: 49 patients (mean (SD) age 61 (11) years; 27 women, 22 men) with known or suspected coronary artery disease were studied with simultaneous myocardial contrast echo (MCE) and SPECT after standard dipyridamole stress. Main outcome measures: Regional myocardial perfusion by SPECT, performed with Tc-99m tetrofosmin, scored qualitatively and also quantitated as per cent maximum activity. Results: Normal perfusion was identified by SPECT in 225 of 270 segments (83%). Contrast echo images were interpretable in 92% of patients. The proportions of normal MCE by grey scale, subtracted, and power Doppler techniques were respectively 76%, 74%, and 88% (p < 0.05) at > 80% of maximum counts, compared with 65%, 69%, and 61% at < 60% of maximum counts. For each technique, specificity was lowest in the lateral wall, although power Doppler was the least affected. Grey scale and subtraction techniques were least accurate in the septal wall, but power Doppler showed particular problems in the apex. On a per patient analysis, sensitivity for the detection of coronary artery disease was 67%, 75%, and 83% using grey scale, colour coded, and power Doppler respectively, with a significant difference between power Doppler and grey scale only (p < 0.05). Specificity was also highest for power Doppler, at 55%, but not significantly different from subtracted colour coded images. Conclusions: Myocardial contrast echo using harmonic power Doppler has greater accuracy than grey scale imaging and digital subtraction. However, power Doppler appears to be less sensitive for mild perfusion defects.

Relevance: 100.00%

Publisher:

Abstract:

Applying programming techniques to detailed data for 406 rice farms in 21 villages for 1997 produces efficiency measures that differ substantially from the results of simple yield and unit cost measures. For the Boro (dry) season, mean technical efficiency was 69.4 per cent, allocative efficiency was 81.3 per cent, cost efficiency was 56.2 per cent and scale efficiency 94.9 per cent. The Aman (wet) season results are similar, but a few points lower. Allocative inefficiency is due to overuse of labour, suggesting population pressure, and of fertiliser, where recommended rates may warrant revision. Second-stage regressions show that large families are more inefficient, whereas farmers with better access to input markets, and those who do less off-farm work, tend to be more efficient. The information on the sources of inter-farm performance differentials could be used by extension agents to help inefficient farmers. There is little excuse for such sub-optimal use of survey data, which are often collected at substantial cost.
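In the Farrell framework behind these measures, cost (economic) efficiency is the product of technical and allocative efficiency, so the reported allocative (81.3 per cent) and cost (56.2 per cent) figures pin down the technical efficiency; a one-line arithmetic check:

```python
# Farrell decomposition: cost efficiency = technical x allocative efficiency
allocative = 0.813
cost = 0.562
technical = cost / allocative   # ~0.691, consistent (to rounding) with 69.4%
```

The small gap between 69.1 and the reported 69.4 per cent is the kind of discrepancy expected when sample means are rounded before publication.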

Relevance: 100.00%

Publisher:

Abstract:

Background: Estimates of the performance of carbohydrate-deficient transferrin (CDT) and gamma-glutamyltransferase (GGT) as markers of alcohol consumption have varied widely, as studies have differed in design and subject characteristics. The WHO/ISBRA Collaborative Study allows assessment and comparison of CDT, GGT, and aspartate aminotransferase (AST) as markers of drinking in a large, well-characterized, multicenter sample. Methods: A total of 1863 subjects were recruited from five countries (Australia, Brazil, Canada, Finland, and Japan). Recruitment was stratified by alcohol use, age, and sex. Demographic characteristics, alcohol consumption, and presence of ICD-10 dependence were recorded using an interview schedule based on the AUDADIS. CDT was assayed using CDTect(TM), and GGT and AST by standard methods. Statistical techniques included receiver operating characteristic (ROC) analysis; multiple regression was used to measure the impact of factors other than alcohol on test performance. Results: CDT and GGT had comparable performance on ROC analysis, with AST performing slightly less well. CDT was a slightly but significantly better marker of high-risk consumption in men. All three markers were more effective for detection of high-risk than intermediate-risk drinking. CDT and GGT levels were influenced by body mass index, sex, age, and smoking status. Conclusions: CDT was little better than GGT in detecting high- or intermediate-risk alcohol consumption in this large, multicenter, predominantly community-based sample. As the two tests are relatively independent of each other, their combination is likely to provide better performance than either test alone. Test interpretation should take account of sex, age, and body mass index.
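The ROC analysis used to compare the markers reduces to the area under the curve, which equals the probability that a randomly chosen high-risk drinker scores above a randomly chosen low-risk drinker. A numpy sketch using the rank (Mann-Whitney) identity, on invented marker values rather than the study's data:

```python
import numpy as np

def roc_auc(scores, labels):
    """Area under the ROC curve via the rank (Mann-Whitney) statistic."""
    order = np.argsort(scores)
    ranks = np.empty_like(order, dtype=float)
    ranks[order] = np.arange(1, len(scores) + 1)     # ranks 1..n, low to high
    pos = labels == 1
    n_pos, n_neg = pos.sum(), (~pos).sum()
    return (ranks[pos].sum() - n_pos * (n_pos + 1) / 2) / (n_pos * n_neg)

rng = np.random.default_rng(4)
# Hypothetical marker values: high-risk drinkers tend to score higher.
labels = np.concatenate([np.ones(100), np.zeros(100)]).astype(int)
scores = np.concatenate([rng.normal(25, 8, 100),     # high-risk group
                         rng.normal(18, 6, 100)])    # low-risk group
auc = roc_auc(scores, labels)   # ~0.75 for this degree of overlap
```

An AUC of 0.5 is chance-level discrimination and 1.0 is perfect separation, which is why overlapping marker distributions, as for intermediate-risk drinking here, yield only modest values.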

Relevance: 100.00%

Publisher:

Abstract:

Landscape metrics are widely applied in landscape ecology to quantify landscape structure. However, many are poorly tested and require rigorous validation if they are to serve as reliable indicators of habitat loss and fragmentation, such as Montreal Process Indicator 1.1e. We apply landscape ecology theory, supported by exploratory and confirmatory statistical techniques, to empirically test landscape metrics for reporting Montreal Process Indicator 1.1e in continuous dry eucalypt forests of sub-tropical Queensland, Australia. The target biota examined included: the Yellow-bellied Glider (Petaurus australis); the diversity of nectar- and sap-feeding glider species, including P. australis, the Sugar Glider P. breviceps, the Squirrel Glider P. norfolcensis, and the Feathertail Glider Acrobates pygmaeus; six diurnal forest bird species; total diurnal bird species diversity; and the density of nectar-feeding diurnal bird species. Two scales of influence were considered: the stand scale (2 ha) and a series of radial landscape extents (500 m - 2 km; 78 - 1250 ha) surrounding each fauna transect. For all biota, stand-scale structural and compositional attributes were found to be more influential than landscape metrics. For the Yellow-bellied Glider, the proportion of trace habitats with a residual element of old spotted-gum/ironbark eucalypt trees was a significant landscape metric at the 2 km landscape extent; this is a measure of habitat loss rather than habitat fragmentation. For the diversity of nectar- and sap-feeding glider species, the proportion of trace habitats with a high coefficient of variation in patch size at the 750 m extent was a significant landscape metric. None of the landscape metrics tested was important for diurnal forest birds. We conclude that no single landscape metric adequately captures the response of the region's forest biota per se.
This poses a major challenge to regional reporting of Montreal Process Indicator 1.1e, fragmentation of forest types.
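The significant metric for glider diversity, the coefficient of variation in patch size, is simply the standard deviation of patch areas divided by their mean; a toy computation with invented patch areas:

```python
import numpy as np

# Hypothetical patch sizes (ha) of trace habitat within one landscape extent.
patch_sizes = np.array([3.2, 0.8, 14.5, 1.1, 6.7, 0.4, 22.3])

# Coefficient of variation: sample standard deviation over the mean.
cv = patch_sizes.std(ddof=1) / patch_sizes.mean()
```

A high CV flags landscapes mixing a few large remnants with many small fragments, which is the patch-size heterogeneity the metric is meant to capture.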