840 results for Automated data analysis


Relevance: 90.00%

Abstract:

ACM Computing Classification System (1998): D.2.11, D.1.3, D.3.1, J.3, C.2.4.

Relevance: 90.00%

Abstract:

This study analyzed the health and overall land cover of citrus crops in Florida. The analysis was completed using Landsat satellite imagery available free of charge from the University of Maryland Global Landcover Change Facility. The project hypothesized that combining citrus production (economic) data with citrus area per county derived from spectral signatures would yield correlations between observable spectral reflectance throughout the year and the fiscal impact of citrus on local economies. A positive correlation between these two data types would allow us to predict the economic impact of citrus using spectral data analysis to determine final crop harvests.
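
The hypothesized analysis comes down to a county-level correlation between spectrally derived citrus area and production value. A minimal sketch of that test, assuming hypothetical per-county figures rather than the study's Landsat-derived data:

```python
# Correlate county citrus area (from spectral classification) with
# county production value. All figures below are hypothetical.
import numpy as np
from scipy import stats

citrus_area_ha = np.array([52000, 38000, 61000, 12000, 47000])   # per county
production_musd = np.array([310.0, 225.0, 390.0, 70.0, 280.0])   # millions USD

r, p_value = stats.pearsonr(citrus_area_ha, production_musd)
print(f"Pearson r = {r:.3f}, p = {p_value:.4f}")

# A strong positive r would support predicting economic impact from
# spectral estimates of citrus area before final harvest figures exist.
if r > 0 and p_value < 0.05:
    slope, intercept, *_ = stats.linregress(citrus_area_ha, production_musd)
    print(f"Predicted value for 30,000 ha: {slope * 30000 + intercept:.1f} M USD")
```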

Relevance: 90.00%

Abstract:

The purpose of this ethnographic study was to describe and explain the congruency of psychological preferences identified by the Myers-Briggs Type Indicator (MBTI) and the human resource development (HRD) role of instructor/facilitator. This investigation was conducted with 23 HRD professionals who worked in the Miami, Florida area as instructors/facilitators with adult learners in job-related contexts.

The study was conducted using qualitative strategies of data collection and analysis. The research participants were selected through a purposive sampling strategy. Data collection strategies included: (a) administration and scoring of the MBTI, Form G, (b) open-ended and semi-structured interviews, (c) participant observations of the research subjects at their respective work sites and while conducting training sessions, (d) field notes, and (e) contact summary sheets to record field research encounters. Data analysis was conducted with the use of a computer program for qualitative analysis called FolioViews 3.1 for Windows. This included: (a) coding of transcribed interviews and field notes, (b) theme analysis, (c) memoing, and (d) cross-case analysis.

The three major themes that emerged in relation to the congruency of psychological preferences and the role of instructor/facilitator were: (1) designing and preparing instruction/facilitation, (2) conducting training and managing group process, and (3) interpersonal relations and perspectives among instructors/facilitators.

The first two themes were analyzed through the combination of the four Jungian personality functions. These combinations are: sensing-thinking (ST), sensing-feeling (SF), intuition-thinking (NT), and intuition-feeling (NF). The third theme was analyzed through the combination of the attitudes or energy focus and the judgment function. These combinations are: extraversion-thinking (ET), extraversion-feeling (EF), introversion-thinking (IT), and introversion-feeling (IF).

A final area uncovered by this ethnographic study was the influence exerted by a training and development culture on the instructor/facilitator role. This professional culture is described and explained in terms of the shared values and expectations reported by the study respondents.

Relevance: 90.00%

Abstract:

The purpose of this study was to document and critically analyze the lived experience of selected nursing staff developers in the process of moving toward a new model for hospital nursing education. Eleven respondents were drawn from a nationwide population of about two hundred individuals involved in nursing staff development. These subjects were responsible for the implementation of the Performance Based Development System (PBDS) in their institutions.

A purposive, criterion-based sampling technique was used, with respondents selected according to size of hospital, primary responsibility for orchestration of the change, influence over budgetary factors, and managerial responsibility for PBDS. Data were gathered by the researcher through both in-person and telephone interviews. A semi-structured interview guide, designed by the researcher, was used, and respondents were encouraged to amplify on their recollections as desired. Audiotapes were transcribed, and the resulting computer files were analyzed using the program "Martin". Answers to interview questions were compiled and reported across cases. The data were then reviewed a second time and interpreted for emerging themes and patterns.

Two types of verification were used in the study. Internal verification was done through interview transcript review and feedback by respondents. External verification was done through review of, and feedback on, the data analysis by readers who were experienced in the management of staff development departments.

All respondents were female, so Gilligan's concept of the "ethic of care" was examined as a decision-making strategy. Three levels of caring that influenced decision making were found: caring (a) for the organization, (b) for the employee, and (c) for the patient. The four existentials of the lived experience (relationality, corporeality, temporality, and spatiality) were also examined to reveal the everydayness of making change.

Relevance: 90.00%

Abstract:

The need for elemental analysis techniques to solve forensic problems continues to expand as the samples collected from crime scenes grow in complexity. Laser ablation ICP-MS (LA-ICP-MS) has been shown to provide a high degree of discrimination between samples that originate from different sources.

In the first part of this research, two laser ablation ICP-MS systems were compared, one using a nanosecond laser and the other a femtosecond laser source, for the forensic analysis of glass. The results showed that femtosecond LA-ICP-MS did not provide significant improvements in terms of accuracy, precision, or discrimination; however, it did provide lower detection limits. In addition, it was determined that even for femtosecond LA-ICP-MS, an internal standard should be utilized to obtain accurate analytical results for glass analyses.

In the second part, a method using laser-induced breakdown spectroscopy (LIBS) for the forensic analysis of glass was shown to provide excellent discrimination for a glass set consisting of 41 automotive fragments. The discrimination power was compared to that of two leading elemental analysis techniques, μXRF and LA-ICP-MS, and the results were similar: all methods generated >99% discrimination, and the pairs found indistinguishable were similar. An extensive data analysis approach for LIBS glass analyses was developed to minimize Type I and Type II errors, leading to a recommendation of 10 ratios to be used for glass comparisons.

Finally, a LA-ICP-MS method for the qualitative analysis and discrimination of gel ink sources was developed and tested on a set of ink samples. In the first discrimination study, qualitative analysis was used to obtain 95.6% discrimination in a blind study consisting of 45 black gel ink samples provided by the United States Secret Service. A 0.4% false exclusion (Type I) error rate and a 3.9% false inclusion (Type II) error rate were obtained for this study. In the second discrimination study, 99% discrimination power was achieved for a black gel ink pen set consisting of 24 self-collected samples. The two pairs found to be indistinguishable came from the same source of origin (the same manufacturer and type of pen, purchased in different locations). It was also found that gel ink from the same pen, regardless of age, was indistinguishable, as were gel ink pens (four pens) originating from the same pack.
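
The comparison procedure described above tests elemental ratios against match criteria. A minimal sketch of one such interval test, assuming illustrative ratio names, replicate values, and a ±4s window rather than the study's recommended 10 ratios:

```python
# Interval-based glass comparison on elemental intensity ratios.
# Ratio names, replicate values and the k=4 window are illustrative.
import numpy as np

def indistinguishable(known_reps, questioned_reps, k=4.0):
    """One-ratio test: samples are indistinguishable on this ratio if the
    questioned mean falls within known mean +/- k sample standard deviations."""
    known = np.asarray(known_reps, dtype=float)
    half_width = k * known.std(ddof=1)
    return abs(np.mean(questioned_reps) - known.mean()) <= half_width

# Hypothetical replicate ratios (e.g. Sr/Zr, Ba/Zr) for two fragments.
known = {"Sr/Zr": [1.02, 1.05, 0.99], "Ba/Zr": [0.41, 0.43, 0.40]}
questioned = {"Sr/Zr": [1.04, 1.01, 1.03], "Ba/Zr": [0.55, 0.57, 0.54]}

# Samples are discriminated if ANY ratio falls outside its match interval.
same_source = all(indistinguishable(known[r], questioned[r]) for r in known)
print("indistinguishable" if same_source else "discriminated")
```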

Relevance: 90.00%

Abstract:

Housing Partnerships (HPs) are collaborative arrangements that assist communities in the delivery of affordable housing by combining the strengths of the public and private sectors. They emerged in several states, counties, and cities in the 1980s as innovative solutions to challenges in affordable housing resulting from the changing dynamics of delivery and production.

My study examines HPs with particular emphasis upon the identification of those factors associated with the successful performance of their mission of affordable housing. I use the Balanced Scorecard (BSC) framework in this study. The identification of performance factors facilitates a better understanding of how HPs can be successful in achieving their mission, and it is significant in the context of the current economic environment because HPs can be viewed as innovative institutional mechanisms in the provision of affordable housing.

The present study uses a mixed-methods research approach, drawing on data from IRS Form 990 tax returns, a survey of the chief executives of HPs, and other secondary sources. The data analysis is framed according to the four perspectives of the BSC: financial, customer, internal business, and learning and growth. Financially, revenue diversification affects the financial health of HPs and overall performance. Although HPs depend on private and government funding, they also depend on service fees to carry out their mission. From a customer perspective, the HPs mainly serve low- and moderate-income households, although some serve specific groups such as seniors, the homeless, veterans, and victims of domestic violence. From an internal business perspective, HPs' programs are oriented toward affordable housing needs, undertaking not only traditional activities such as construction and loan provision but also advocacy and educational programs. From a learning and growth perspective, the HPs are small in staff size but undertake a range of activities with the help of volunteers. Every part of the HP is developed to maximize resources, knowledge, and skills in order to assist communities in the delivery of affordable housing and related needs.

Overall, housing partnerships have played a key role in affordable housing despite the housing market downturn since 2006. Their expenses on affordable housing activities increased despite the decrease in their revenues.
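
One way to make the financial-perspective finding concrete is a revenue diversification measure over Form 990 revenue categories. A minimal sketch, assuming a Herfindahl-style index and hypothetical figures, not the study's exact operationalization:

```python
# Revenue diversification from Form 990-style revenue categories.
# Index formula and sample figures are illustrative assumptions.

def diversification_index(revenues):
    """1 - sum of squared revenue shares: 0 means a single source,
    values near 1 mean revenue spread evenly across many sources."""
    total = sum(revenues.values())
    shares = [v / total for v in revenues.values()]
    return 1.0 - sum(s * s for s in shares)

# Hypothetical HP revenue mix (thousands USD).
hp_revenue = {"government_grants": 1200.0,
              "private_contributions": 800.0,
              "program_service_fees": 500.0}
print(f"Diversification index: {diversification_index(hp_revenue):.2f}")
```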

Relevance: 90.00%

Abstract:

Advanced Placement is a series of courses and tests designed to determine mastery over introductory college material. It has become part of the American educational system. The changing conception of AP was examined using critical theory to determine what led to a view of continual success. The study utilized David Armstrong's variation of Michel Foucault's critical theory to construct an analytical framework. Black and Ubbes' data gathering techniques and Braun and Clark's data analysis were utilized within this framework. Data included 1135 documents: 641 journal articles, 421 newspaper articles, and 82 government documents.

The study revealed three historical ruptures, each correlated to a theme containing subthemes. The first rupture was the Sputnik launch in 1958; its correlated theme was AP leading to school reform, with subthemes of AP as reform for able students and AP's gaining of acceptance from secondary schools and higher education. The second rupture was the Nation at Risk report published in 1983; its correlated theme was AP's shift in emphasis from the exam to the course, with subthemes of AP as a course, a shift in AP's target population, using AP courses to promote equity, and AP courses modifying curricula. The passage of the No Child Left Behind Act of 2001 was the third rupture; its correlated theme was AP as a means to narrow the achievement gap, with subthemes of AP as a college preparatory program and the shifting of AP to an open-access program.

The themes revealed a perception that progressively integrated the program into American education. The AP program changed emphasis from tests to curriculum and is seen as the nation's premier academic program to promote reform and prepare students for college. It has become a major source of income for the College Board. In effect, AP has become an agent of privatization, spurring other private entities into competition for government funding. The change and growth of the program over the past 57 years resulted in a deep integration into American education. As such, the program remains an intrinsic part of the system and continues to evolve within American education.

Relevance: 90.00%

Abstract:

Women are a high-risk population for cardiovascular disease (CVD); however, evidence on the relationship between CVD and subpopulations of mothers is sparse. A secondary data analysis of the 2006 Health Survey of Adults and Children in Bermuda was conducted to compare the prevalence of CVD risk factors in single (n=77) and partnered (n=241) mothers. A higher percentage of single mothers were Black, had a BMI above 25 kg/m² (p=0.01), and reported high blood pressure (p=0.004) and high cholesterol (p=0.017). Single mothers were nearly three times (OR=2.66) more likely to experience high blood pressure and more than twice (OR=2.22) as likely to have high cholesterol. Single mothers may benefit from nutrition education programs related to lowering CVD risk.
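
For readers unfamiliar with the reported effect sizes, a minimal sketch of how an odds ratio such as OR=2.66 is computed from a 2x2 exposure-outcome table; the counts below are hypothetical, not the Bermuda survey data:

```python
# Odds-ratio calculation for a standard 2x2 exposure-outcome table.
# All counts are hypothetical illustrations.

def odds_ratio(exposed_cases, exposed_noncases, unexposed_cases, unexposed_noncases):
    """OR = (a/b) / (c/d): odds of the outcome among the exposed group
    divided by the odds among the unexposed group."""
    return (exposed_cases / exposed_noncases) / (unexposed_cases / unexposed_noncases)

# Exposure: single motherhood; outcome: high blood pressure.
or_hbp = odds_ratio(exposed_cases=30, exposed_noncases=47,
                    unexposed_cases=47, unexposed_noncases=194)
print(f"Odds ratio (high blood pressure): {or_hbp:.2f}")
```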

Relevance: 90.00%

Abstract:

Thermodynamic stability measurements on proteins and protein-ligand complexes can offer insights not only into the fundamental properties of protein folding reactions and protein functions, but also into the development of protein-directed therapeutic agents to combat disease. Conventional calorimetric or spectroscopic approaches for measuring protein stability typically require large amounts of purified protein. This requirement has precluded their use in proteomic applications. Stability of Proteins from Rates of Oxidation (SPROX) is a recently developed mass spectrometry-based approach for proteome-wide thermodynamic stability analysis. Since the proteomic coverage of SPROX is fundamentally limited by the detection of methionine-containing peptides, the use of tryptophan-containing peptides was investigated in this dissertation. A new SPROX-like protocol was developed that measured protein folding free energies using the denaturant dependence of the rate at which globally protected tryptophan and methionine residues are modified with dimethyl(2-hydroxy-5-nitrobenzyl)sulfonium bromide and hydrogen peroxide, respectively. This so-called Hybrid protocol was applied to proteins in yeast and MCF-7 cell lysates and achieved a ~50% increase in proteomic coverage compared to probing only methionine-containing peptides. Subsequently, the Hybrid protocol was successfully utilized to identify and quantify both known and novel protein-ligand interactions in cell lysates. The ligands under study included the well-known Hsp90 inhibitor geldanamycin and the less well-understood omeprazole sulfide that inhibits liver-stage malaria. In addition to protein-small molecule interactions, protein-protein interactions involving Puf6 were investigated using the SPROX technique in comparative thermodynamic analyses performed on wild-type and Puf6-deletion yeast strains. A total of 39 proteins were detected as Puf6 targets, and 36 of these targets were previously unknown to interact with Puf6. Finally, to facilitate the SPROX/Hybrid data analysis process and minimize human errors, a Bayesian algorithm was developed for transition midpoint assignment. In summary, the work in this dissertation expanded the scope of SPROX and evaluated the use of SPROX/Hybrid protocols for characterizing protein-ligand interactions in complex biological mixtures.
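
The final step described above assigns a transition midpoint to each peptide's chemical-denaturation curve. A minimal sketch of what midpoint extraction involves, assuming synthetic data and a simple four-parameter logistic fit rather than the dissertation's Bayesian algorithm:

```python
# Fit a sigmoid to a peptide's modification signal versus denaturant
# concentration and read off the transition midpoint (C1/2).
# Data and model choice are illustrative assumptions.
import numpy as np
from scipy.optimize import curve_fit

def sigmoid(c, base, amp, c_half, slope):
    """Four-parameter logistic in denaturant concentration c."""
    return base + amp / (1.0 + np.exp((c_half - c) / slope))

denaturant_M = np.linspace(0.0, 3.0, 10)             # e.g. GdmCl series
signal = sigmoid(denaturant_M, 0.1, 0.9, 1.4, 0.2)   # synthetic curve
signal += np.random.default_rng(0).normal(0, 0.02, signal.size)

popt, _ = curve_fit(sigmoid, denaturant_M, signal, p0=[0.0, 1.0, 1.5, 0.3])
print(f"Transition midpoint C1/2 ~ {popt[2]:.2f} M")
```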

Relevance: 90.00%

Abstract:

Energy efficiency and user comfort have recently become priorities in the Facility Management (FM) sector. This has resulted in the use of innovative building components, such as thermal solar panels and heat pumps, as they have the potential to provide better performance, energy savings, and increased user comfort. However, as the complexity of components increases, so does the requirement for maintenance management. The standard routine for building maintenance is inspection, which results in repairs or replacement when a fault is found. This routine leads to unnecessary inspections, which carry a cost in component downtime and work hours. This research proposes an alternative routine: performing building maintenance at the point in time when the component is degrading and requires maintenance, thus reducing the frequency of unnecessary inspections. This thesis demonstrates that statistical techniques can be used as part of a maintenance management methodology to invoke maintenance before failure occurs. The proposed FM process is presented through a scenario utilising current Building Information Modelling (BIM) technology and innovative contractual and organisational models. This FM scenario supports a Degradation based Maintenance (DbM) scheduling methodology, implemented using two statistical techniques, Particle Filters (PFs) and Gaussian Processes (GPs). DbM consists of extracting and tracking a degradation metric for a component. Limits for the degradation metric are identified based on one of a number of proposed processes; these processes determine the limits based on the maturity of the historical information available. DbM is implemented for three case study components: a heat exchanger, a heat pump, and a set of bearings. The identified degradation points for each case study, from a PF, a GP, and a hybrid (PF and GP combined) DbM implementation, are assessed against known degradation points. The GP implementations are successful for all components. For the PF implementations, the results presented in this thesis find that the extracted metrics and limits identify degradation occurrences accurately for components in continuous operation; for components with seasonal operational periods, the PF may wrongly identify degradation. The GP performs more robustly than the PF, but the PF, on average, results in fewer false positives. The hybrid implementations, which combine GP and PF results, are successful for two of the three case studies and are not affected by seasonal data. Overall, DbM is effectively applied for the three case study components. The accuracy of the implementations is dependent on the relationships modelled by the PF and GP, and on the type and quantity of data available. This novel maintenance process can improve equipment performance and reduce energy wastage from BSCs' operation.
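
A minimal sketch of the Gaussian Process half of DbM, assuming a synthetic degradation metric, an off-the-shelf scikit-learn GP, and an illustrative limit value; the thesis's PF implementation and limit-setting processes are not reproduced:

```python
# Fit a GP to a component's degradation metric over time, then identify
# the point where the smoothed metric crosses a degradation limit.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(1)
days = np.arange(0, 200, 5, dtype=float).reshape(-1, 1)
# Synthetic degradation metric, e.g. normalised heat-exchanger fouling.
metric = 0.002 * days.ravel() + rng.normal(0.0, 0.02, days.shape[0])

# The WhiteKernel absorbs measurement noise so the RBF models the trend.
gp = GaussianProcessRegressor(kernel=RBF(50.0) + WhiteKernel(1e-3))
gp.fit(days, metric)

grid = np.linspace(0, 200, 400).reshape(-1, 1)
mean, std = gp.predict(grid, return_std=True)

LIMIT = 0.3  # degradation limit, set from historical information
crossed = grid[mean >= LIMIT]
if crossed.size:
    print(f"Degradation point identified around day {crossed[0, 0]:.0f}")
else:
    print("Metric stays below the degradation limit")
```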

Relevance: 90.00%

Abstract:

Safeguarding organizations against opportunism and severe deception in computer-mediated communication (CMC) presents a major challenge to CIOs and IT managers. New insights into linguistic cues of deception derive from the speech acts innate to CMC. Applying automated text analysis to archival email exchanges in a CMC system as part of a reward program, we assess the ability of word use (micro-level), message development (macro-level), and intertextual exchange cues (meta-level) to detect severe deception by business partners. We empirically assess the predictive ability of our framework using an ordinal multilevel regression model. Results indicate that deceivers minimize the use of referencing and self-deprecation but include more superfluous descriptions and flattery. Deceitful channel partners also over-structure their arguments and rapidly mimic the linguistic style of the account manager across dyadic email exchanges. Thanks to its diagnostic value, the proposed framework can support firms' decision-making and guide the development of compliance monitoring systems.
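
As a concrete illustration of the micro-level (word use) cues described above, here is a minimal sketch of extracting such features from a message; the cue lexicons and example email are hypothetical stand-ins for the study's actual feature set:

```python
# Extract simple word-use cues (self-reference, flattery) from an email.
# Lexicons and the sample message are illustrative assumptions.
import re

SELF_REFERENCE = {"i", "me", "my", "mine", "myself"}
FLATTERY = {"great", "excellent", "wonderful", "outstanding", "impressive"}

def word_use_cues(message: str) -> dict:
    tokens = re.findall(r"[a-z']+", message.lower())
    n = max(len(tokens), 1)
    return {
        "self_reference_rate": sum(t in SELF_REFERENCE for t in tokens) / n,
        "flattery_rate": sum(t in FLATTERY for t in tokens) / n,
        "message_length": len(tokens),
    }

email = "Your program is excellent and the rewards are outstanding. We shipped everything."
print(word_use_cues(email))
# Per the findings above, deceivers tend toward low self-reference and
# elevated flattery rates across a dyadic exchange.
```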

Relevance: 90.00%

Abstract:

Background: There is increasing interest in how culture may affect the quality of healthcare services, and previous research has shown that 'treatment culture', of which there are three categories (resident centred, ambiguous and traditional), in a nursing home may influence prescribing of psychoactive medications.

Objective: The objective of this study was to explore and understand treatment culture in prescribing of psychoactive medications for older people with dementia in nursing homes.

Method: Six nursing homes, two from each treatment culture category, participated in this study. Qualitative data were collected through semi-structured interviews with nursing home staff and general practitioners (GPs), which sought to determine participants' views on prescribing and administration of psychoactive medication, and their understanding of treatment culture and its potential influence on prescribing of psychoactive drugs. Following verbatim transcription, the data were analysed and themes were identified, facilitated by NVivo and discussion within the research team.

Results: Interviews took place with five managers, seven nurses, 13 care assistants and two GPs. Four themes emerged: the characteristics of the setting, the characteristics of the individual, relationships and decision making. The characteristics of the setting were exemplified by views of the setting, daily routines and staff training. The characteristics of the individual were demonstrated by views on the personhood of residents and staff attitudes. Relationships varied between staff within and outside the home. These relationships appeared to influence decision making about prescribing of medications. The data analysis found that each home exhibited traits that were indicative of its respective assigned treatment culture.

Conclusion: Nursing home treatment culture appeared to be influenced by four main themes. Modification of these factors may lead to a shift in culture towards a more flexible, resident-centred culture and a reduction in prescribing and use of psychoactive medication.

Relevance: 90.00%

Abstract:

Tide gauge data are identified as legacy data, given the radical transition in observation method and required output format associated with tide gauges over the 20th century. Observed water level variation through tide-gauge records is regarded as the only significant basis for determining recent historical variation (decade to century) in mean sea level and storm surge. Few tide gauge records cover the full 20th century, so the Belfast (UK) Harbour tide gauge would be a strategic long-term (110 years) record if the full paper-based records (marigrams) were digitally restructured to allow for consistent data analysis. This paper presents the methodology for extracting a consistent time series of observed water levels from the five different Belfast Harbour tide-gauge positions/machine types, starting in late 1901. Tide-gauge data were digitally retrieved from the original analogue (daily) records by scanning the marigrams and then extracting the sequential tidal elevations with graph-line-seeking software (Ungraph™). This automation of signal extraction allowed the full Belfast series to be retrieved quickly, relative to any manual x–y digitisation of the signal. Restructuring variable-length tidal data sets into a consistent daily, monthly and annual file format was undertaken by project-developed software: Merge&Convert and MergeHYD allow consistent water level sampling both at 60 min (the past standard) and 10 min intervals, the latter enhancing surge measurement. Belfast tide-gauge data have been rectified, validated and quality controlled (IOC 2006 standards). The result is a consistent annual-based legacy data series for Belfast Harbour that includes over 2 million tidal-level observations.
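
A minimal sketch of the restructuring step, assuming synthetic marigram-derived levels and pandas resampling; the project's Merge&Convert and MergeHYD tools are not reproduced here:

```python
# Resample digitised water levels to consistent 10-minute and 60-minute
# series and aggregate to daily summaries. Timestamps and levels are
# synthetic stand-ins for the scanned Belfast marigram data.
import numpy as np
import pandas as pd

# Levels extracted at an awkward 7-minute spacing (metres), following an
# approximate semidiurnal (12.42 h) tidal cycle.
t = pd.date_range("1901-11-01", periods=500, freq="7min")
hours = np.arange(500) * 7 / 60
levels = pd.Series(1.5 + 1.2 * np.sin(hours / 12.42 * 2 * np.pi),
                   index=t, name="water_level_m")

ten_min = levels.resample("10min").mean().interpolate()  # surge-resolving series
hourly = levels.resample("60min").mean()                 # past-standard series
daily = pd.DataFrame({"mean": hourly.resample("D").mean(),
                      "max": hourly.resample("D").max()})
print(daily.head())
```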