928 results for automation of fit analysis


Relevance:

100.00%

Publisher:

Abstract:

In an effort to improve instruction and better accommodate the needs of students, community colleges are offering courses in a variety of delivery formats that require students to have some level of technology fluency to be successful in the course. This study was conducted to investigate the relationship between student socioeconomic status (SES), course delivery method, and course type on enrollment, final course grades, course completion status, and course passing status at a state college.

A dataset of 20,456 students of low and not-low SES enrolled in science, technology, engineering, and mathematics (STEM) course types delivered in traditional, online, blended, and web-enhanced formats at Miami Dade College, a large open-access 4-year state college located in Miami-Dade County, Florida, was analyzed. A factorial ANOVA using course type, course delivery method, and student SES found no significant differences in final course grades when used to determine whether course delivery methods were equally effective for students of low and not-low SES taking STEM course types. Additionally, three chi-square goodness-of-fit tests were used to investigate differences in enrollment, course completion status, and course passing status by SES, course type, and course delivery method. The chi-square tests indicated that: (a) there were significant differences in enrollment by SES and course delivery method for the Engineering/Technology, Math, and overall course types, but not for the Natural Science course type, and (b) there were no significant differences in course completion status or course passing status by SES and course type overall, or by SES and course delivery method overall. However, there were statistically significant but weak relationships between course passing status, SES, and the Math course type, as well as between course passing status, SES, and the online and traditional course delivery methods.

The mixed findings in the study indicate that strides have been made in closing the theoretical gap in education and technology skills that may exist for students of different SES levels. MDC's course delivery and student support models may assist other institutions in addressing student success in courses that require some level of technology fluency.
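A chi-square goodness-of-fit test of the kind used in the study can be sketched as follows. The enrollment counts and population shares below are invented for illustration, not the study's data:

```python
# Illustrative chi-square goodness-of-fit test (hypothetical enrollment counts,
# not the study's data): do low-SES and not-low-SES students enroll in an
# online course type at the rates expected from their overall shares?
observed = {"low_SES": 230, "not_low_SES": 270}          # enrolled in online sections
expected_share = {"low_SES": 0.40, "not_low_SES": 0.60}  # shares in the full population

total = sum(observed.values())
chi_sq = sum(
    (observed[g] - expected_share[g] * total) ** 2 / (expected_share[g] * total)
    for g in observed
)

# Critical value for alpha = 0.05 with df = 1 (two categories) is 3.841.
significant = chi_sq > 3.841
print(round(chi_sq, 3), significant)  # -> 7.5 True
```

With these made-up counts the test statistic exceeds the critical value, so enrollment would differ significantly from the expected shares.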


The importance of checking the normality assumption in most statistical procedures, especially parametric tests, cannot be overemphasized, as the validity of the inferences drawn from such procedures usually depends on the validity of this assumption. Numerous methods have been proposed by different authors over the years, some popular and frequently used, others less so. This study assesses the performance of eighteen of the available tests for different sample sizes and significance levels, and for a number of symmetric and asymmetric distributions, by conducting a Monte Carlo simulation. The results showed that considerable power is not achieved for symmetric distributions when the sample size is less than one hundred; for such distributions, the kurtosis test is most powerful, provided the distribution is leptokurtic or platykurtic. The Shapiro-Wilk test remains the most powerful test for asymmetric distributions. We conclude that different tests are suitable under different characteristics of the alternative distribution.
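The Monte Carlo design behind such a power study can be sketched in a few lines. This is a minimal illustration with a simple kurtosis-based test (critical value estimated empirically under the null), not a reproduction of the paper's eighteen-test simulation:

```python
import random
import statistics

def excess_kurtosis(xs):
    """Sample excess kurtosis: 4th central moment / variance^2, minus 3."""
    m = statistics.fmean(xs)
    var = statistics.fmean((x - m) ** 2 for x in xs)
    m4 = statistics.fmean((x - m) ** 4 for x in xs)
    return m4 / var ** 2 - 3.0

def mc_power(sampler, n=100, reps=500, alpha=0.05, seed=1):
    """Monte Carlo power of a two-sided kurtosis test of normality.

    The critical value is the empirical (1-alpha) quantile of |excess
    kurtosis| over `reps` normal samples, so the test has size ~alpha."""
    rng = random.Random(seed)
    null = sorted(
        abs(excess_kurtosis([rng.gauss(0, 1) for _ in range(n)]))
        for _ in range(reps)
    )
    crit = null[int((1 - alpha) * reps)]
    hits = sum(
        abs(excess_kurtosis([sampler(rng) for _ in range(n)])) > crit
        for _ in range(reps)
    )
    return hits / reps

# Uniform data are platykurtic (excess kurtosis -1.2), so the kurtosis
# test should reject normality far more often than alpha = 0.05.
power_uniform = mc_power(lambda rng: rng.uniform(0, 1))
```

The same loop, swapped over samplers and test statistics, yields the power comparisons the paper reports.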


The L-moments-based index-flood procedure was successfully applied in a Regional Flood Frequency Analysis (RFFA) for the Island of Newfoundland in 2002, using data up to 1998. This thesis considered both Labrador and the Island of Newfoundland, using the L-moments index-flood method with flood data up to 2013. For Labrador, the homogeneity test showed that Labrador can be treated as a single homogeneous region, and the generalized extreme value (GEV) distribution was found to be more robust than the other frequency distributions. Drainage area (DA) is the only significant variable for estimating the index flood at ungauged sites in Labrador. In previous studies, the Island of Newfoundland was considered as four homogeneous regions (A, B, C and D) as well as the Water Survey of Canada's two sub-regions, Y and Z. Homogeneous regions based on Y and Z were found to provide more accurate quantile estimates than those based on the four homogeneous regions. Goodness-of-fit test results showed that the GEV distribution is most suitable for the sub-regions; however, the three-parameter lognormal (LN3) distribution gave a better performance in terms of robustness. The best-fitting regional frequency distribution from 2002 has now been updated with the latest flood data, but quantile estimates with the new data were not very different from the previous study. Overall, in terms of quantile estimation in both Labrador and the Island of Newfoundland, the index-flood procedure based on L-moments is highly recommended, as it provided more consistent and accurate results than other techniques, such as the regression-on-quantiles technique currently used by the government.
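The core of the procedure, sample L-moments via probability-weighted moments, followed by an index-flood quantile estimate, can be sketched as below. The annual-maxima series and the growth factor are invented for illustration, not values from the thesis:

```python
def sample_l_moments(data):
    """First two sample L-moments via probability-weighted moments.

    l1 is the mean; l2 is a dispersion measure. b1 uses the standard
    unbiased weights (j-1)/(n-1) on the ascending order statistics."""
    xs = sorted(data)
    n = len(xs)
    b0 = sum(xs) / n
    b1 = sum(x * (j - 1) / (n - 1) for j, x in enumerate(xs, start=1)) / n
    l1 = b0
    l2 = 2 * b1 - b0
    return l1, l2

# Index-flood estimate at a gauged site: quantile = index flood (here the
# mean annual flood, l1) times a regional growth factor. The growth factor
# below is a made-up illustration, not a fitted regional GEV value.
annual_maxima = [120.0, 95.0, 150.0, 110.0, 180.0, 130.0, 105.0, 160.0]
l1, l2 = sample_l_moments(annual_maxima)
growth_factor_100yr = 2.1          # hypothetical regional growth-curve value
q100 = l1 * growth_factor_100yr    # 100-year flood estimate at this site
```

Higher-order L-moment ratios (L-skewness, L-kurtosis) computed the same way drive the homogeneity and goodness-of-fit tests mentioned above.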


This dissertation falls within a new field of applied linguistics called forensic linguistics, which studies language as evidence in criminal cases. There are many subfields within forensic linguistics; this study belongs to authorship attribution analysis, in which the authorship of a text is attributed to an author through an exhaustive linguistic analysis. Within this field, this study analyzes the morphosyntactic and discursive-pragmatic variables that remain constant in the intra-speaker variation, or personal style, of a speaker in oral and written discourse, while at the same time showing a high rate of difference in inter-speaker variation, from one speaker to another. The theoretical basis of this study is the term coined by Professor Maria Teresa Turell, "idiolectal style". This term holds that the idiosyncratic choices a speaker makes from the language build a style for each speaker that is constant in the intra-variation of the speaker's discourse. This study arises from a problem encountered in authorship attribution analysis, where the absence of some known texts impedes the analysis needed to attribute the authorship of an unknown text. Thus, through a methodology based on qualitative analysis, in which the variables are studied exhaustively, and on quantitative analysis, in which the findings from the qualitative analysis are studied statistically, conclusions are drawn on the evidence of such variables in both oral and written discourse. The results of this analysis will have further implications for deeper analyses in which larger amounts of data can be used.


Aircraft manufacturing industries are looking for solutions to increase their productivity. One such solution is to apply metrology systems during the production and assembly processes. The Metrology Process Model (MPM) (Maropoulos et al., 2007) has been introduced, which ties metrology applications to assembly planning, manufacturing processes and product design. Measurability analysis is part of the MPM, and the aim of this analysis is to check the feasibility of measuring the designed large-scale components. Measurability analysis has been integrated in order to provide an efficient matching system. The metrology database is structured by developing the Metrology Classification Model, and the feature-based selection model is also explained. By combining the two classification models, a novel approach and selection process for an integrated measurability analysis system (MAS) are introduced; such an integrated MAS can provide much more meaningful matching results for operators. © Springer-Verlag Berlin Heidelberg 2010.
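The matching idea can be illustrated with a minimal sketch: select instruments whose measuring range covers a component feature and whose uncertainty satisfies a tolerance rule. The instrument list, figures, and the 10:1 rule threshold are assumptions for illustration, not the paper's classification models:

```python
# Hypothetical sketch of feature-to-instrument matching in the spirit of an
# integrated measurability analysis system (MAS); instrument data are invented.
INSTRUMENTS = [
    # name, max measuring range (m), measurement uncertainty (mm)
    ("laser tracker",        40.0, 0.05),
    ("photogrammetry",       10.0, 0.10),
    ("articulated-arm CMM",   2.5, 0.03),
]

def feasible_instruments(feature_size_m, tolerance_mm, rule=0.1):
    """Return instruments that cover the feature and satisfy a 10:1 rule of
    thumb, i.e. uncertainty <= `rule` * tolerance."""
    return [
        name
        for name, rng, unc in INSTRUMENTS
        if rng >= feature_size_m and unc <= rule * tolerance_mm
    ]

# A 15 m wing-skin feature with a 1 mm tolerance: only long-range systems
# with uncertainty <= 0.1 mm qualify.
matches = feasible_instruments(15.0, 1.0)
```

A real MAS would match on many more classified attributes (access, environment, feature geometry), but the selection logic follows this filtering pattern.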


This thesis looks at how non-experts develop an opinion on climate change, and how those opinions could be changed by public discourse. I use Hubert Dreyfus' account of skill acquisition to distinguish between experts and non-experts. I then use a combination of Walter Fisher's narrative paradigm and the hermeneutics of Paul Ricœur to explore how non-experts form opinions, and how public narratives can provide a point of critique. For public narratives to be robust, they must be financially realistic. I therefore consider the burgeoning field of environmental, social, and corporate governance (ESG) analysis as a way of informing realistic public narratives. I identify a potential problem with this approach: the Western assumptions of ESG analysis might make for public narratives that are not convincing to a non-Western audience. I then demonstrate how elements of the Chinese tradition (the Confucian, Neo-Confucian, and Daoist schools), as presented by David Hall and Roger Ames, can provide alternative assumptions for ESG analysis so that the public narratives will be more culturally adaptable. This research contributes to the discipline by bringing disparate traditions together in a unique way, in a practical project with a view towards applications. I conclude by considering avenues for further research.


The need for elemental analysis techniques to solve forensic problems continues to expand as the samples collected from crime scenes grow in complexity. Laser ablation ICP-MS (LA-ICP-MS) has been shown to provide a high degree of discrimination between samples that originate from different sources. In the first part of this research, two laser ablation ICP-MS systems were compared for the forensic analysis of glass: one using a nanosecond laser and the other a femtosecond laser source. The results showed that femtosecond LA-ICP-MS did not provide significant improvements in terms of accuracy, precision and discrimination; however, it did provide lower detection limits. In addition, it was determined that, even for femtosecond LA-ICP-MS, an internal standard should be utilized to obtain accurate analytical results for glass analyses. In the second part, a method using laser-induced breakdown spectroscopy (LIBS) for the forensic analysis of glass was shown to provide excellent discrimination for a glass set consisting of 41 automotive fragments. The discrimination power was compared to that of two of the leading elemental analysis techniques, µXRF and LA-ICP-MS, and the results were similar; all methods generated >99% discrimination, and the pairs found indistinguishable were similar. An extensive data analysis approach for LIBS glass analyses was developed to minimize Type I and Type II errors, leading to a recommendation of 10 ratios to be used for glass comparisons. Finally, a LA-ICP-MS method for the qualitative analysis and discrimination of gel ink sources was developed and tested on a set of ink samples. In the first discrimination study, qualitative analysis was used to obtain 95.6% discrimination in a blind study consisting of 45 black gel ink samples provided by the United States Secret Service. A 0.4% false exclusion (Type I) error rate and a 3.9% false inclusion (Type II) error rate were obtained for this discrimination study.

In the second discrimination study, 99% discrimination power was achieved for a black gel ink pen set consisting of 24 self-collected samples. The two pairs found to be indistinguishable came from the same source of origin (the same manufacturer and type of pen, purchased in different locations). It was also found that gel ink from the same pen, regardless of age, was indistinguishable, as were gel ink pens (four pens) originating from the same pack.
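How discrimination power is computed from pairwise comparisons can be sketched as below. The ±3-sigma match-interval criterion is one common approach in forensic glass comparison (the study's exact criterion is not stated here), and the replicate measurements are synthetic:

```python
import statistics

def indistinguishable(meas_a, meas_b, k=3.0):
    """Two samples match if their means differ by at most k standard
    deviations (a simple +/-3-sigma match-interval criterion)."""
    ma, sa = statistics.mean(meas_a), statistics.stdev(meas_a)
    mb, sb = statistics.mean(meas_b), statistics.stdev(meas_b)
    return abs(ma - mb) <= k * max(sa, sb)

def discrimination_power(samples):
    """Fraction of all possible pairs that are distinguished."""
    pairs = [(i, j) for i in range(len(samples)) for j in range(i + 1, len(samples))]
    distinguished = sum(not indistinguishable(samples[i], samples[j]) for i, j in pairs)
    return distinguished / len(pairs)

# Hypothetical element-ratio replicates for three glass fragments.
fragments = [
    [1.00, 1.02, 0.98],   # source A
    [1.01, 0.99, 1.00],   # source A (duplicate: should match the first)
    [1.50, 1.52, 1.48],   # source B (should be distinguished)
]
dp = discrimination_power(fragments)   # 2 of 3 pairs distinguished
```

In practice the comparison is run over each of the recommended element ratios, declaring samples indistinguishable only if every ratio matches.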


Mineralogical, geochemical, magnetic, and siliciclastic grain-size signatures of 34 surface sediment samples from the Mackenzie-Beaufort Sea Slope and Amundsen Gulf were studied in order to better constrain the redox status, detrital particle provenance, and sediment dynamics in the western Canadian Arctic. Redox-sensitive elements (Mn, Fe, V, Cr, Zn) indicate that modern sedimentary deposition within the Mackenzie-Beaufort Sea Slope and Amundsen Gulf took place under oxic bottom-water conditions, with more turbulent mixing conditions, and thus a well-oxygenated water column, prevailing within the Amundsen Gulf. The analytical data obtained, combined with multivariate statistical (notably, principal component and fuzzy c-means clustering) and spatial analyses, allowed the division of the study area into four provinces with distinct sedimentary compositions: (1) the Mackenzie Trough-Canadian Beaufort Shelf, with high phyllosilicate-Fe oxide-magnetite and Al-K-Ti-Fe-Cr-V-Zn-P contents; (2) Southwestern Banks Island, characterized by high dolomite-K-feldspar and Ca-Mg-LOI contents; (3) the Central Amundsen Gulf, a transitional zone typified by intermediate phyllosilicate-magnetite-K-feldspar-dolomite and Al-K-Ti-Fe-Mn-V-Zn-Sr-Ca-Mg-LOI contents; and (4) mud volcanoes on the Canadian Beaufort Shelf, distinguished by poorly sorted coarse silt with high quartz-plagioclase-authigenic carbonate and Si-Zr contents, as well as high magnetic susceptibility. Our results also confirm that present-day sedimentary dynamics on the Canadian Beaufort Shelf are mainly controlled by sediment supply from the Mackenzie River. Overall, these insights provide a basis for future studies using the mineralogical, geochemical, and magnetic signatures of Canadian Arctic sediments to reconstruct past variations in sediment inputs and transport pathways related to late Quaternary climate and oceanographic changes.
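The fuzzy c-means algorithm used to delimit the provinces can be sketched in one dimension. This is a minimal textbook implementation with invented proxy values, not the study's multivariate analysis:

```python
# Minimal fuzzy c-means (FCM) sketch on 1-D proxy values; the study applied
# the same family of algorithm (alongside PCA) to multivariate data.
def fuzzy_cmeans_1d(xs, centres, m=2.0, iters=50):
    for _ in range(iters):
        # Membership of each point in each cluster (standard FCM update).
        u = []
        for x in xs:
            d = [max(abs(x - c), 1e-12) for c in centres]  # guard zero distance
            u.append([
                1.0 / sum((d[i] / d[j]) ** (2.0 / (m - 1.0)) for j in range(len(centres)))
                for i in range(len(centres))
            ])
        # Centre update: membership-weighted means.
        centres = [
            sum(u[k][i] ** m * xs[k] for k in range(len(xs)))
            / sum(u[k][i] ** m for k in range(len(xs)))
            for i in range(len(centres))
        ]
    return centres

# Two well-separated groups of hypothetical dolomite contents (%):
values = [2.0, 2.5, 3.0, 18.0, 19.0, 20.5]
centres = fuzzy_cmeans_1d(values, centres=[0.0, 10.0])
```

Unlike hard k-means, each sample retains graded membership in every cluster, which is what makes FCM suited to transitional provinces such as the Central Amundsen Gulf.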


The inherent analogue nature of medical ultrasound signals, in conjunction with the abundant merits of digital image acquisition and the increasing use of relatively simple front-end circuitries, has created considerable demand for single-bit Σ−Δ beamformers in digital ultrasound imaging systems. Furthermore, the increasing need to design lightweight ultrasound systems with low power consumption and low noise provides ample justification for development and innovation in the use of single-bit Σ−Δ beamformers in ultrasound imaging systems. The overall aim of this research program is to investigate, establish, develop and confirm, through a combination of theoretical analysis and detailed simulations that utilize raw phantom data sets, suitable techniques for the design of simple-to-implement, hardware-efficient Σ−Δ digital ultrasound beamformers to address the requirements of 3D scanners with large channel counts, as well as portable and lightweight ultrasound scanners for point-of-care applications and intravascular imaging systems. In addition, the stability boundaries of higher-order High-Pass (HP) and Band-Pass (BP) Σ−Δ modulators for single- and dual-sinusoidal inputs are determined using quasi-linear modeling together with the describing-function method, to more accurately model the Σ−Δ modulator quantizer. The theoretical results are shown to be in good agreement with the simulation results for a variety of input amplitudes, bandwidths, and modulator orders. The proposed mathematical models of the quantizer will immensely help speed up the design of higher-order HP and BP Σ−Δ modulators for digital ultrasound beamformers. Finally, a user-friendly design and performance evaluation tool for LP, BP and HP Σ−Δ modulators is developed. This toolbox, which uses various design methodologies and covers an assortment of Σ−Δ modulator topologies, is intended to accelerate the design process and evaluation of Σ−Δ modulators.

This design tool is further developed to enable the design, analysis and evaluation of Σ−Δ beamformer structures, including noise analyses of the final B-scan images. Thus, the tool will allow researchers and practitioners to design and verify different reconstruction filters and analyze the results directly on B-scan ultrasound images, saving considerable time and effort.
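The property that single-bit Σ−Δ beamformers exploit can be shown with the simplest case, a first-order low-pass modulator: a DC input is encoded as a ±1 bit stream whose running average approaches the input. This is an illustrative sketch only; the thesis concerns higher-order HP and BP modulators:

```python
# First-order Sigma-Delta modulator: integrate the error between the input
# and the fed-back 1-bit output, then quantize the integrator state.
def sigma_delta(inputs):
    bits, integrator = [], 0.0
    for x in inputs:
        integrator += x - (bits[-1] if bits else 0.0)  # feedback of last output
        bits.append(1.0 if integrator >= 0 else -1.0)  # 1-bit quantizer
    return bits

n = 10_000
stream = sigma_delta([0.4] * n)      # constant input inside [-1, 1]
dc_estimate = sum(stream) / n        # low-pass filtering recovers ~0.4
```

Averaging (low-pass filtering and decimating) the bit stream recovers the signal; the quantization error is pushed to high frequencies, which is why simple front-end circuitry suffices.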


The Internet and the Web have changed the way companies communicate with their publics, improving relations between them and providing substantial benefits for organizations. This has led small and medium enterprises (SMEs) to develop corporate sites to establish relationships with their audiences. This paper, applying the methodology of content analysis, analyzes the main factors and tools that make websites usable and intuitive, promoting better relations between SMEs and their audiences. It also develops an index to measure the effectiveness of websites from the perspective of usability. The results indicate that the websites studied have, in general, appropriate levels of usability.
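An effectiveness index of the kind the paper describes is typically a weighted aggregate of content-analysis criteria. The criteria and weights below are invented for illustration; the paper's actual factor list is not reproduced here:

```python
# Sketch of a usability effectiveness index: a weighted sum of binary
# content-analysis criteria, scaled to 0-100. Criteria/weights are invented.
CRITERIA_WEIGHTS = {
    "navigation_menu": 0.25,
    "search_box": 0.15,
    "contact_info": 0.20,
    "readable_typography": 0.20,
    "mobile_friendly": 0.20,
}

def usability_index(site_audit):
    """site_audit maps each criterion to True/False from the content analysis."""
    return 100 * sum(w for c, w in CRITERIA_WEIGHTS.items() if site_audit.get(c, False))

audit = {"navigation_menu": True, "search_box": False,
         "contact_info": True, "readable_typography": True, "mobile_friendly": True}
score = usability_index(audit)   # ~85 for this hypothetical SME site
```

Scores computed this way across a sample of sites give the distribution from which "appropriate levels of usability" can be judged.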


We formally compare fundamental factor and latent factor approaches to oil price modelling. Fundamental modelling has a long history in seeking to understand oil price movements, while latent factor modelling has a more recent and limited history but has gained popularity in other financial markets. The two approaches, though competing, have not previously been formally compared for effectiveness. For a range of short-, medium- and long-dated WTI oil futures, we test a recently proposed five-factor fundamental model and a Principal Component Analysis latent factor model. Our findings demonstrate that there is no discernible difference between the two techniques in a dynamic setting. We conclude that this implies certain advantages in adopting the latent factor approach, given the difficulty of determining a well-specified fundamental model.
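The latent-factor idea can be illustrated by extracting the first principal component of simulated futures returns. The data-generating process (one common factor plus idiosyncratic noise) and all numbers are assumptions for illustration, and power iteration stands in for a full eigendecomposition:

```python
import random

# Simulate returns for N futures contracts driven by one common latent factor.
random.seed(42)
T, N = 1000, 4
factor = [random.gauss(0.0, 1.0) for _ in range(T)]
returns = [[factor[t] + random.gauss(0.0, 0.2) for _ in range(N)] for t in range(T)]

# Sample covariance matrix of the de-meaned contract return series.
means = [sum(row[j] for row in returns) / T for j in range(N)]
cov = [[sum((row[i] - means[i]) * (row[j] - means[j]) for row in returns) / (T - 1)
        for j in range(N)] for i in range(N)]

# Power iteration for the dominant eigenvector (the first principal component).
v = [1.0] * N
for _ in range(100):
    w = [sum(cov[i][j] * v[j] for j in range(N)) for i in range(N)]
    norm = sum(x * x for x in w) ** 0.5
    v = [x / norm for x in w]
lam1 = sum(v[i] * sum(cov[i][j] * v[j] for j in range(N)) for i in range(N))

# With one strong common factor, PC1 explains most of the total variance.
explained = lam1 / sum(cov[i][i] for i in range(N))
```

This is the sense in which a PCA latent factor model sidesteps model specification: the dominant components are read off the covariance structure rather than chosen from economic fundamentals.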


SYSTEMATIC REVIEW AND META-ANALYSIS: EFFECTS OF WALKING EXERCISE IN CHRONIC MUSCULOSKELETAL PAIN

O'Connor S.R.1, Tully M.A.2, Ryan B.3, Baxter D.G.3, Bradley J.M.1, McDonough S.M.1
1University of Ulster, Health & Rehabilitation Sciences Research Institute, Newtownabbey, United Kingdom; 2Queen's University, UKCRC Centre of Excellence for Public Health (NI), Belfast, United Kingdom; 3University of Otago, Centre for Physiotherapy Research, Dunedin, New Zealand

Purpose: To examine the effects of walking exercise on pain and self-reported function in adults with chronic musculoskeletal pain.

Relevance: Chronic musculoskeletal pain is a major cause of morbidity, exerting a substantial influence on long-term health status and overall quality of life. Current treatment recommendations advocate various aerobic exercise interventions for such conditions. Walking may represent an ideal form of exercise due to its relatively low impact. However, there is currently limited evidence for its effectiveness.

Participants: Not applicable.

Methods: A comprehensive search strategy was undertaken by two independent reviewers according to the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) and the recommendations of the Cochrane Musculoskeletal Review Group. Six electronic databases (Medline, CINAHL, PsycINFO, PEDro, SPORTDiscus and the Cochrane Central Register of Controlled Trials) were searched for relevant papers published up to January 2010 using MeSH terms. All randomised or non-randomised studies published in full were considered for inclusion. Studies were required to include adults aged 18 years or over with a diagnosis of chronic low back pain, osteoarthritis or fibromyalgia. Studies were excluded if they involved peri-operative or post-operative interventions or did not include a comparative, non-exercise or non-walking exercise control group. The U.S. Preventive Services Task Force system was used to assess methodological quality. Data for pain and self-reported function were extracted and converted to a score out of 100.

Analysis: Data were pooled and analyzed using RevMan (v.5.0.24). Statistical heterogeneity was assessed using the χ2 and I2 test statistics. A random effects model was used to calculate the mean differences and 95% CIs. Data were analyzed by length of final follow-up, categorized as short-term (≤8 weeks post-randomisation), mid-term (>8 weeks to 12 months) or long-term (>12 months).

Results: A total of 4324 articles were identified, and twenty studies (1852 participants) meeting the inclusion criteria were included in the review. Overall, studies were judged to be of at least fair methodological quality. The most common sources of likely bias were lack of concealed allocation and failure to adequately address incomplete data. Data from 12 studies were suitable for meta-analysis. Walking led to reductions in pain at short-term (≤8 weeks post-randomisation) (-8.44 [-14.54, -2.33]) and mid-term (>8 weeks to 12 months) follow-up (-9.28 [-16.34, -2.22]). No effect was observed for long-term (>12 months) data (-2.49 [-7.62, 2.65]). For function, between-group differences were observed for short-term (-11.57 [-16.06, -7.08]) and mid-term data (-13.26 [-16.91, -9.62]). A smaller effect was also observed at long-term follow-up (-5.60 [-7.70, -3.50]).

Conclusions: Walking interventions were associated with statistically significant improvements in pain and function at short- and mid-term follow-up. Long-term data were limited but indicated that these effects do not appear to be maintained beyond twelve months.

Implications: Walking may be an effective form of exercise for individuals with chronic musculoskeletal pain. However, further research is required examining longer-term follow-up and dose-response issues in this population.

Key words: 1. Walking exercise 2. Musculoskeletal pain 3. Systematic review

Funding acknowledgements: Department of Employment and Learning, Northern Ireland.

Ethics approval: Not applicable.
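Random-effects pooling of mean differences of the kind performed in RevMan can be sketched with the DerSimonian-Laird estimator. The three studies below are invented, not the review's data:

```python
import math

def dersimonian_laird(effects, variances):
    """Pooled effect and 95% CI under a DerSimonian-Laird random-effects model."""
    w = [1 / v for v in variances]                     # fixed-effect weights
    fixed = sum(wi * e for wi, e in zip(w, effects)) / sum(w)
    q = sum(wi * (e - fixed) ** 2 for wi, e in zip(w, effects))  # Cochran's Q
    df = len(effects) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)                      # between-study variance
    w_star = [1 / (v + tau2) for v in variances]       # random-effects weights
    pooled = sum(wi * e for wi, e in zip(w_star, effects)) / sum(w_star)
    se = math.sqrt(1 / sum(w_star))
    return pooled, (pooled - 1.96 * se, pooled + 1.96 * se)

# Mean pain differences (0-100 scale) and their variances from three
# hypothetical walking trials.
pooled, ci = dersimonian_laird([-8.0, -15.0, -3.0], [9.0, 16.0, 12.0])
```

When heterogeneity (Q) exceeds its degrees of freedom, tau-squared is positive and the weights flatten toward equality, widening the confidence interval relative to a fixed-effect pool.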


An inverse analysis for reactive transport of chlorides through concrete in the presence of an electric field is presented. The model is solved using MATLAB's built-in solvers "pdepe.m" and "ode15s.m". The results from the model are compared with experimental measurements from an accelerated migration test, and a function representing the lack of fit is formed. This function is optimised with respect to a varying number of key parameters defining the model, using a Levenberg-Marquardt trust-region optimisation approach. The paper presents a method by which the degree of inter-dependency between parameters and the sensitivity (significance) of each parameter towards model predictions can be studied on models with or without clearly defined governing equations. Eigenvalue analysis of the Hessian matrix was employed to investigate and avoid over-parametrisation in the inverse analysis. We investigated simultaneous fitting of parameters for diffusivity, chloride binding as defined by the Freundlich isotherm (thermodynamic parameter), and binding rate (kinetic parameter). Fitting more than two parameters simultaneously demonstrates a high degree of parameter inter-dependency. This finding is significant, as mathematical models for chloride transport rely on several parameters for each mode of transport (i.e., diffusivity, binding, etc.), which combined may lead to unreliable simultaneous estimation of parameters.
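The eigenvalue check for over-parametrisation can be made concrete with a toy model in which two parameters are perfectly inter-dependent. The model y = a·b·t (only the product a·b matters) is an assumption for illustration, not the paper's transport model:

```python
# When parameters are inter-dependent, the Gauss-Newton Hessian J^T J
# becomes near-singular; its smallest eigenvalue collapses toward zero.
times = [1.0, 2.0, 3.0, 4.0, 5.0]
a, b = 2.0, 3.0

# Analytic Jacobian columns for y = a*b*t: dy/da = b*t, dy/db = a*t.
# The columns are proportional, so J^T J is rank-deficient.
J = [[b * t, a * t] for t in times]

# H = J^T J (2x2, symmetric).
h11 = sum(r[0] * r[0] for r in J)
h12 = sum(r[0] * r[1] for r in J)
h22 = sum(r[1] * r[1] for r in J)

# Eigenvalues of a symmetric 2x2 matrix in closed form.
centre = (h11 + h22) / 2
half = (((h11 - h22) / 2) ** 2 + h12 ** 2) ** 0.5
lam_max, lam_min = centre + half, centre - half

# lam_min ~ 0: a and b cannot be estimated simultaneously from this data.
ill_conditioned = lam_min < 1e-9 * lam_max
```

In the paper's setting the Jacobian is built by finite differences around the fitted parameters, but the diagnosis is the same: a near-zero eigenvalue of the Hessian flags a parameter combination the data cannot resolve.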


Many countries have set challenging wind power targets to achieve by 2020. This paper implements a realistic analysis of curtailment and constraint of wind energy at a nodal level using a unit commitment and economic dispatch model of the Irish Single Electricity Market in 2020. The key findings show that significant reduction in curtailment can be achieved when the system non-synchronous penetration limit increases from 65% to 75%. For the period analyzed, this results in a decreased total generation cost and a reduction in the dispatch-down of wind. However, some nodes experience significant dispatch-down of wind, which can be in the order of 40%. This work illustrates the importance of implementing analysis at a nodal level for the purpose of power system planning.
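The mechanism by which the SNSP limit drives curtailment can be illustrated with a minimal sketch: in each period, dispatched wind cannot exceed the limit times demand. The hourly figures are invented, not outputs of the paper's unit commitment model:

```python
# Minimal illustration of system non-synchronous penetration (SNSP) driven
# curtailment: wind above snsp_limit * demand must be dispatched down.
def curtailment(demand_mw, wind_mw, snsp_limit):
    """Total curtailed wind energy (MWh, 1 h periods) over the horizon."""
    return sum(max(0.0, w - snsp_limit * d) for d, w in zip(demand_mw, wind_mw))

demand = [4000.0, 3500.0, 3000.0, 4500.0]   # MW in each hour (hypothetical)
wind = [3000.0, 2800.0, 2500.0, 2000.0]     # available wind in each hour

curtailed_65 = curtailment(demand, wind, 0.65)
curtailed_75 = curtailment(demand, wind, 0.75)
# Raising the SNSP limit from 65% to 75% reduces total curtailment.
```

The paper's nodal analysis adds network constraints on top of this system-wide cap, which is why some individual nodes still see dispatch-down on the order of 40% even as the system total falls.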


Typologies have been an important tool for the development of comparative social policy research and continue to be widely used, in spite of growing criticism of their ability to capture the complexity of welfare states and their internal heterogeneity. In particular, debates have focused on the presence of hybrid cases and the existence of distinct cross-national patterns of variation across areas of social policy. There is growing awareness of these issues, but empirical research often still relies on methodologies aimed at classifying countries into a limited number of unambiguous types. This article proposes a two-step approach based on fuzzy-set ideal type analysis for the systematic analysis of hybrids at the level of both policies (step 1) and policy configurations, or combinations of policies (step 2). The approach is demonstrated using the case of childcare policies in European economies. In the first step, parental leave policies are analysed using three methods (direct, indirect, and combinatory) to identify and describe specific hybrid forms at the level of policy analysis. In the second step, the analysis focuses on the relationship between parental leave and childcare services in order to develop an overall typology of childcare policies, which clearly shows that many countries display characteristics normally associated with different types (hybrids). This two-step approach therefore enhances our ability to account for, and make sense of, hybrid welfare forms produced by tensions and contradictions within and between policies.
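The core operation of fuzzy-set ideal type analysis can be sketched as follows: a country's membership in an ideal type is the minimum over its component set memberships (fuzzy AND), with negation as 1 − x. The sets, scores, and type below are invented for illustration, not the article's calibration:

```python
# Step-1 sketch of fuzzy-set ideal type analysis. A membership score of 0.5
# is the crossover point; scores near 0.5 signal hybrid cases.
def ideal_type_membership(memberships, profile):
    """profile maps each set name to True (present) or False (negated).
    Returns the fuzzy-AND (minimum) of the, possibly negated, memberships."""
    return min(memberships[s] if present else 1.0 - memberships[s]
               for s, present in profile.items())

# Hypothetical fuzzy membership scores for one country on two policy sets.
country = {"generous_leave": 0.8, "gender_equal_design": 0.3}

# Ideal type "generous but traditional":
# generous_leave AND NOT gender_equal_design.
score = ideal_type_membership(country, {"generous_leave": True,
                                        "gender_equal_design": False})
```

Computing this score for every ideal type yields a country's full membership profile; hybrids are the cases with non-trivial membership in more than one type, which is exactly what the two-step approach is designed to surface.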