930 results for Multidimensional data analysis


Relevance:

90.00%

Publisher:

Abstract:

TDA/H (attention-deficit/hyperactivity disorder, ADHD) is usually considered among the most frequent psychological disorders in both childhood and adolescence. It covers a complex combination of neurocognitive deficits leading to developmental problems linked to attention failure, hyperactivity and impulsivity. Diagnosing TDA/H, however, is frequently difficult, since sociocultural aspects of symptom evaluation introduce some etiologic vagueness. Additionally, the wide range of evaluation tools, together with the diversity of therapeutic approaches reported in the specialized literature, justifies investigating how TDA/H is diagnosed and treated by the medical doctors, psychologists and psycho-pedagogues working in Natal-RN (Brazil) who assist children and teenagers with a TDA/H diagnostic hypothesis. A convenience sample of thirty-four professionals participated in this study and took part in a semi-structured interview. The resulting information was analyzed, categorized and submitted to a multidimensional descriptive analysis (a cluster analysis procedure), which partitioned the sample into two groups: Group 1, composed mainly of medical professionals, and Group 2, composed of psychologists and psycho-pedagogues. The categorized variable “Number of sessions” (the average time taken to arrive at a diagnosis) made the largest statistical contribution to the partition, followed by the variables “Professional formation” and “Use of diagnostic tools”. Variables such as “Comorbidity”, “TDA/H Definition” and “Modalities of Intervention” also contributed to the partition, albeit to a lesser degree.
Despite some similarity between the two groups, the data demonstrated a specific association between professionals' academic background and the diagnostic and intervention modalities they adopted when dealing with TDA/H. These data confirm substantial heterogeneity in how TDA/H is handled, attributable to the professional formation of those involved in its diagnosis and treatment.
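A minimal sketch of the kind of two-group partitioning such a descriptive cluster analysis performs (all variable names, category values and profiles below are illustrative, not the study's data):

```python
from itertools import combinations

# Hypothetical categorised interview profiles: one dict of categorical
# variables per professional (names and values are illustrative only).
profiles = {
    "md1":  {"sessions": "few",  "formation": "medical", "tools": "scales"},
    "md2":  {"sessions": "few",  "formation": "medical", "tools": "scales"},
    "psy1": {"sessions": "many", "formation": "psycho",  "tools": "interview"},
    "psy2": {"sessions": "many", "formation": "psycho",  "tools": "interview"},
    "ped1": {"sessions": "many", "formation": "psycho",  "tools": "interview"},
}

def mismatch(a, b):
    """Simple matching distance: number of variables with differing categories."""
    return sum(a[k] != b[k] for k in a)

def two_cluster_partition(profiles):
    """Greedy 2-cluster split: seed with the most dissimilar pair, then assign
    each profile to the nearer seed (a crude stand-in for hierarchical clustering)."""
    names = list(profiles)
    s1, s2 = max(combinations(names, 2),
                 key=lambda p: mismatch(profiles[p[0]], profiles[p[1]]))
    groups = {s1: [s1], s2: [s2]}
    for n in names:
        if n in (s1, s2):
            continue
        near = min((s1, s2), key=lambda s: mismatch(profiles[n], profiles[s]))
        groups[near].append(n)
    return sorted(sorted(g) for g in groups.values())

print(two_cluster_partition(profiles))
```

On these toy profiles the split recovers a "medical" group and a "psycho-pedagogical" group, driven mostly by the "sessions" and "formation" variables, mirroring the role those variables play in the partition described above.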

Relevance:

90.00%

Publisher:

Abstract:

Because it is linked to factors of academic success and social adjustment (Eccles & Roeser, 2009; Finn, 1989; Janosz, Georges, & Parent, 1998), students' sense of school belonging is considered a first-order element that must be developed and maintained by education professionals (MELS, 2012). The general objective was to deepen our understanding of the sense of belonging to school. To meet this objective, three distinct research articles were produced. The first article presents a conceptual analysis aimed at clarifying the concept of sense of school belonging. The conceptual method followed in this article is that of Walker and Avant (2011). The literature review and the empirical referents identified indicate that the concept is multidimensional in nature. The data analysis points to four defining attributes. The student must: (1) feel a positive emotion towards the school environment; (2) maintain quality social relationships with members of the school community; (3) be actively involved in classroom or school activities; and (4) perceive a certain synergy (harmonization), or even similarity, with the members of his or her group. Building on this clearer understanding of the sense of school belonging, the second article examined the factor structure and invariance of the Psychological Sense of School Membership (PSSM) instrument with respect to student sex. This study was conducted on a sample of 766 girls and 391 boys in the third year of secondary school. Confirmatory factor analyses indicated a three-factor structure: (1) the quality of relationships among students; (2) the quality of relationships between students and the teacher; and (3) the feeling of acceptance by the school environment.
Multigroup factor analyses indicated, in turn, that the PSSM is invariant across girls and boys in the third year of secondary school. Finally, the third article was conducted on a sample of 4166 secondary school students in order to examine the complex psychological processes operating between the sense of belonging and academic achievement (Anderman & Freeman, 2004; Connell et al., 1994; Roeser et al., 1996). To examine these processes, four hypotheses derived from the Freeman-Anderman model were tested through path analyses: H1, positive affect partially and positively mediates the effect of the sense of belonging on behavioural engagement; H2, positive affect partially and positively mediates the effect of the sense of belonging on affective engagement; H3, positive affect partially and positively mediates the effect of the sense of belonging on cognitive engagement; and H4, affective, cognitive and behavioural engagement partially and positively mediate the effect of the sense of belonging on academic achievement. Our results partially support the first hypothesis and support hypotheses two, three and four. Specifically, the relationship between the sense of belonging and emotional engagement shows a direct effect that is larger than the indirect effect (H2). The study produced similar results for cognitive engagement (H3). Finally, the relationship between the sense of belonging and academic achievement shows an indirect effect that is larger than the direct effect (H4). In light of these results, recommendations for education professionals are offered by way of conclusion.
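The comparison of direct and indirect (mediated) effects behind hypotheses H1 to H4 can be illustrated with a toy path analysis: regress the mediator on the predictor (a-path), then the outcome on both (b-path and direct effect c'), and compare c' with the indirect product a*b. All data and coefficients below are synthetic, not the study's:

```python
import random

# Toy path-analysis sketch of one mediation path (all data synthetic): does
# positive affect (M) mediate the effect of sense of belonging (X) on
# engagement (Y)? True coefficients: a = 0.5, b = 0.6, direct c' = 0.4.
random.seed(0)
n = 2000
X = [random.gauss(0, 1) for _ in range(n)]
M = [0.5 * x + random.gauss(0, 1) for x in X]
Y = [0.4 * x + 0.6 * m + random.gauss(0, 1) for x, m in zip(X, M)]

def ols(y, *xs):
    """Least squares with intercept via the normal equations (Gauss-Jordan)."""
    cols = [[1.0] * len(y)] + [list(x) for x in xs]
    k = len(cols)
    A = [[sum(ci * cj for ci, cj in zip(cols[i], cols[j])) for j in range(k)]
         for i in range(k)]
    b = [sum(ci * yi for ci, yi in zip(cols[i], y)) for i in range(k)]
    for p in range(k):                 # naive elimination; fine for tiny k
        for r in range(k):
            if r != p:
                f = A[r][p] / A[p][p]
                A[r] = [arj - f * apj for arj, apj in zip(A[r], A[p])]
                b[r] -= f * b[p]
    return [b[i] / A[i][i] for i in range(k)]

a = ols(M, X)[1]                  # a-path: X -> M
_, c_prime, bcoef = ols(Y, X, M)  # direct effect c' and b-path: M -> Y
indirect = a * bcoef              # indirect (mediated) effect
print(round(c_prime, 2), round(indirect, 2))
```

A "larger direct than indirect effect", as reported for H2 and H3, corresponds to c' exceeding a*b; the reverse pattern corresponds to the achievement result (H4).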

Relevance:

90.00%

Publisher:

Abstract:

Thermodynamic stability measurements on proteins and protein-ligand complexes can offer insights not only into the fundamental properties of protein folding reactions and protein functions, but also into the development of protein-directed therapeutic agents to combat disease. Conventional calorimetric or spectroscopic approaches for measuring protein stability typically require large amounts of purified protein. This requirement has precluded their use in proteomic applications. Stability of Proteins from Rates of Oxidation (SPROX) is a recently developed mass spectrometry-based approach for proteome-wide thermodynamic stability analysis. Since the proteomic coverage of SPROX is fundamentally limited by the detection of methionine-containing peptides, the use of tryptophan-containing peptides was investigated in this dissertation. A new SPROX-like protocol was developed that measured protein folding free energies using the denaturant dependence of the rate at which globally protected tryptophan and methionine residues are modified with dimethyl (2-hydroxyl-5-nitrobenzyl) sulfonium bromide and hydrogen peroxide, respectively. This so-called Hybrid protocol was applied to proteins in yeast and MCF-7 cell lysates and achieved a ~50% increase in proteomic coverage compared to probing only methionine-containing peptides. Subsequently, the Hybrid protocol was successfully utilized to identify and quantify both known and novel protein-ligand interactions in cell lysates. The ligands under study included the well-known Hsp90 inhibitor geldanamycin and the less well-understood omeprazole sulfide that inhibits liver-stage malaria. In addition to protein-small molecule interactions, protein-protein interactions involving Puf6 were investigated using the SPROX technique in comparative thermodynamic analyses performed on wild-type and Puf6-deletion yeast strains. 
A total of 39 proteins were detected as Puf6 targets and 36 of these targets were previously unknown to interact with Puf6. Finally, to facilitate the SPROX/Hybrid data analysis process and minimize human errors, a Bayesian algorithm was developed for transition midpoint assignment. In summary, the work in this dissertation expanded the scope of SPROX and evaluated the use of SPROX/Hybrid protocols for characterizing protein-ligand interactions in complex biological mixtures.
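As a rough illustration of transition-midpoint assignment (a Bayesian-flavoured grid search, not the dissertation's actual algorithm), candidate midpoints of a two-state sigmoid can be scored against denaturant-dependence data; all concentrations and fraction-modified values below are synthetic:

```python
import math

# Grid-search sketch of transition-midpoint assignment. Synthetic
# fraction-modified data versus denaturant concentration for one
# peptide's two-state sigmoidal transition.
den  = [0.0, 0.5, 1.0, 1.5, 2.0, 2.5, 3.0]          # [denaturant], M
frac = [0.02, 0.05, 0.10, 0.55, 0.90, 0.97, 0.99]   # fraction modified

def sigmoid(c, mid, slope=4.0):
    return 1.0 / (1.0 + math.exp(-slope * (c - mid)))

def log_posterior(mid, sigma=0.05):
    # Flat prior on the midpoint, Gaussian measurement noise.
    return sum(-((f - sigmoid(c, mid)) ** 2) / (2 * sigma ** 2)
               for c, f in zip(den, frac))

grid = [i / 100 for i in range(301)]      # candidate midpoints, 0.00-3.00 M
best = max(grid, key=log_posterior)
print(best)
```

For this toy data set the posterior peaks near 1.5 M denaturant, where the synthetic transition is centred; a full implementation would also propagate the posterior width as an uncertainty on the folding free energy.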

Relevance:

90.00%

Publisher:

Abstract:

Energy efficiency and user comfort have recently become priorities in the Facility Management (FM) sector. This has resulted in the use of innovative building components, such as thermal solar panels, heat pumps, etc., as they have potential to provide better performance, energy savings and increased user comfort. However, as the complexity of components increases, the requirement for maintenance management also increases. The standard routine for building maintenance is inspection which results in repairs or replacement when a fault is found. This routine leads to unnecessary inspections which have a cost with respect to downtime of a component and work hours. This research proposes an alternative routine: performing building maintenance at the point in time when the component is degrading and requires maintenance, thus reducing the frequency of unnecessary inspections. This thesis demonstrates that statistical techniques can be used as part of a maintenance management methodology to invoke maintenance before failure occurs. The proposed FM process is presented through a scenario utilising current Building Information Modelling (BIM) technology and innovative contractual and organisational models. This FM scenario supports a Degradation based Maintenance (DbM) scheduling methodology, implemented using two statistical techniques, Particle Filters (PFs) and Gaussian Processes (GPs). DbM consists of extracting and tracking a degradation metric for a component. Limits for the degradation metric are identified based on one of a number of proposed processes. These processes determine the limits based on the maturity of the historical information available. DbM is implemented for three case study components: a heat exchanger; a heat pump; and a set of bearings. The identified degradation points for each case study, from a PF, a GP and a hybrid (PF and GP combined) DbM implementation are assessed against known degradation points. 
The GP implementations are successful for all components. For the PF implementations, the results presented in this thesis find that the extracted metrics and limits identify degradation occurrences accurately for components in continuous operation; for components with seasonal operational periods, the PF may wrongly identify degradation. The GP performs more robustly than the PF, but the PF, on average, produces fewer false positives. The hybrid implementations, which combine GP and PF results, are successful for two of the three case studies and are not affected by seasonal data. Overall, DbM is applied effectively to the three case study components. The accuracy of the implementations is dependent on the relationships modelled by the PF and GP, and on the type and quantity of data available. This novel maintenance process can improve equipment performance and reduce energy wastage from BSC operation.
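The PF side of the DbM idea can be conveyed with a minimal bootstrap particle filter: propagate particles with an assumed degradation model, weight them by the sensor likelihood, resample, and flag the step where the filtered metric crosses its limit. The degradation model, noise levels and limit below are all illustrative, not the thesis implementation:

```python
import math
import random

# Minimal bootstrap particle filter tracking a slowly degrading efficiency
# metric from noisy readings, flagging when the estimate crosses a limit.
random.seed(1)
T, N, LIMIT = 60, 500, 0.7

truth, obs = [], []
x = 1.0
for _ in range(T):
    x -= 0.006 + random.gauss(0, 0.001)    # slow degradation drift
    truth.append(x)
    obs.append(x + random.gauss(0, 0.02))  # noisy sensor reading

particles = [1.0] * N
flagged = None
for t, y in enumerate(obs):
    # Propagate with the assumed degradation model, weight by the
    # observation likelihood, then resample (multinomial).
    particles = [p - 0.006 + random.gauss(0, 0.003) for p in particles]
    weights = [math.exp(-((y - p) ** 2) / (2 * 0.02 ** 2)) for p in particles]
    particles = random.choices(particles, weights=weights, k=N)
    est = sum(particles) / N
    if flagged is None and est < LIMIT:
        flagged = t
print(flagged)
```

The flagged time step is the point at which maintenance would be invoked, replacing a fixed inspection schedule; a GP implementation would instead regress the metric against time and flag the crossing of the same limit.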

Relevance:

90.00%

Publisher:

Abstract:

Advanced Placement (AP) is a series of courses and tests designed to determine mastery of introductory college material, and it has become part of the American educational system. The changing conception of AP was examined using critical theory to determine what led to a view of continual success. The study utilized David Armstrong’s variation of Michel Foucault’s critical theory to construct an analytical framework, with Black and Ubbes’ data-gathering techniques and Braun and Clarke’s data analysis. Data included 1135 documents: 641 journal articles, 421 newspaper articles and 82 government documents. The study revealed three historical ruptures, each correlated to a theme containing subthemes. The first rupture was the Sputnik launch in 1957. Its correlated theme was AP leading to school reform, with subthemes of AP as reform for able students and AP’s gaining acceptance from secondary schools and higher education. The second rupture was the publication of A Nation at Risk in 1983. Its correlated theme was AP’s shift in emphasis from the exam to the course, with the subthemes of AP as a course, a shift in AP’s target population, the use of AP courses to promote equity, and AP courses modifying curricula. The passage of the No Child Left Behind Act of 2001 was the third rupture. Its correlated theme was AP as a means to narrow the achievement gap, with the subthemes of AP as a college-preparatory program and the shift of AP to an open-access program. Together the themes revealed a perception of AP as progressively integrated into American education. The AP program changed emphasis from tests to curriculum and is seen as the nation’s premier academic program for promoting reform and preparing students for college. It has also become a major source of income for the College Board; in effect, AP has become an agent of privatization, spurring other private entities into competition for government funding.
The change and growth of the program over the past 57 years have resulted in its deep integration into American education. As such, the program remains an intrinsic part of the system and continues to evolve within American education.

Relevance:

90.00%

Publisher:

Abstract:

The necessity of elemental analysis techniques for solving forensic problems continues to expand as the samples collected from crime scenes grow in complexity. Laser ablation ICP-MS (LA-ICP-MS) has been shown to provide a high degree of discrimination between samples that originate from different sources. In the first part of this research, two laser ablation ICP-MS systems were compared, one using a nanosecond laser and the other a femtosecond laser source, for the forensic analysis of glass. The results showed that femtosecond LA-ICP-MS did not provide significant improvements in terms of accuracy, precision and discrimination; however, it did provide lower detection limits. In addition, it was determined that even for femtosecond LA-ICP-MS an internal standard should be used to obtain accurate analytical results for glass analyses. In the second part, a method using laser-induced breakdown spectroscopy (LIBS) for the forensic analysis of glass was shown to provide excellent discrimination for a glass set consisting of 41 automotive fragments. Its discrimination power was compared to that of two leading elemental analysis techniques, µXRF and LA-ICP-MS, with similar results: all methods generated >99% discrimination, and the pairs found indistinguishable were similar. An extensive data analysis approach for LIBS glass analyses was developed to minimize Type I and Type II errors, leading to a recommendation of 10 ratios to be used for glass comparisons. Finally, a LA-ICP-MS method for the qualitative analysis and discrimination of gel ink sources was developed and tested on a set of ink samples. In the first discrimination study, qualitative analysis yielded 95.6% discrimination in a blind study consisting of 45 black gel ink samples provided by the United States Secret Service. A 0.4% false exclusion (Type I) error rate and a 3.9% false inclusion (Type II) error rate were obtained for this study.
In the second discrimination study, 99% discrimination power was achieved for a black gel ink pen set consisting of 24 self-collected samples. The two pairs found indistinguishable came from the same source of origin (the same manufacturer and type of pen, purchased in different locations). It was also found that gel ink from the same pen, regardless of age, was indistinguishable, as were gel ink pens (four pens) originating from the same pack.
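The bookkeeping behind such discrimination and error-rate figures can be sketched as follows, using a fixed match interval of 3 standard deviations on synthetic element ratios (the criterion and data are illustrative, not the dissertation's):

```python
from itertools import combinations

# Match-criterion bookkeeping on synthetic element ratios. Each sample
# is (source_id, {ratio_name: (mean, std)}); all values are illustrative.
samples = [
    ("A", {"Sr/Zr": (1.00, 0.02), "Ba/Zr": (0.50, 0.01)}),
    ("A", {"Sr/Zr": (1.02, 0.02), "Ba/Zr": (0.51, 0.01)}),
    ("B", {"Sr/Zr": (1.40, 0.02), "Ba/Zr": (0.48, 0.01)}),
    ("C", {"Sr/Zr": (0.70, 0.02), "Ba/Zr": (0.30, 0.01)}),
]

def match(r1, r2, k=3):
    """Two fragments match if every ratio mean agrees within k standard deviations."""
    return all(abs(r1[n][0] - r2[n][0]) <= k * max(r1[n][1], r2[n][1])
               for n in r1)

same_pairs = diff_pairs = false_excl = false_incl = 0
for (s1, r1), (s2, r2) in combinations(samples, 2):
    if s1 == s2:
        same_pairs += 1
        false_excl += not match(r1, r2)   # false exclusion = Type I error
    else:
        diff_pairs += 1
        false_incl += match(r1, r2)       # false inclusion = Type II error

discrimination = 1 - false_incl / diff_pairs
print(discrimination, false_excl, false_incl)
```

Discrimination power is the fraction of different-source pairs correctly distinguished; widening the match interval trades Type I errors for Type II errors, which is why a recommended set of ratios and intervals matters.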

Relevance:

90.00%

Publisher:

Abstract:

During the SINOPS project, an optimal, state-of-the-art simulation of the marine silicon cycle is attempted, employing a biogeochemical ocean general circulation model (BOGCM) for three particular time slices relevant to global (paleo-)climate. In order to tune the model optimally, the simulation results are compared to a comprehensive data set of 'real' observations. SINOPS' scientific data management ensures that the data structure remains homogeneous throughout the project. The practical work routine comprises systematic progress from data acquisition, through preparation, processing, quality checking and archiving, up to the presentation of the data to the scientific community. Meta-information and analytical data are mapped by an n-dimensional catalogue in order to itemize each analytical value and to serve as an unambiguous identifier. In practice, data management is carried out by means of the online-accessible information system PANGAEA, which offers a tool set comprising a data warehouse, a Geographic Information System (GIS), 2-D plots, cross-section plots, etc., and whose multidimensional data model promotes scientific data mining. Beyond the scientific and technical aspects, this alliance between the scientific project team and the data management crew serves to integrate the participants and allows them to gain mutual respect and appreciation.
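The n-dimensional catalogue idea can be sketched as a composite key built from metadata dimensions; the dimension names and the record below are illustrative, not PANGAEA's actual schema:

```python
# Composite-key sketch of an n-dimensional catalogue entry (dimension names
# and the record are illustrative, not PANGAEA's actual schema).
dimensions = ("campaign", "station", "depth_m", "parameter", "method")

def catalogue_key(record):
    """Join the catalogue dimensions into one unambiguous identifier."""
    return "/".join(str(record[d]) for d in dimensions)

record = {"campaign": "SINOPS-1", "station": "PS59-12", "depth_m": 120,
          "parameter": "Si(OH)4", "method": "colorimetry"}
print(catalogue_key(record))
```

Because every analytical value carries the full set of dimensions, any value can be retrieved, plotted or cross-referenced unambiguously, which is what makes the multidimensional data model suitable for data mining.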

Relevance:

90.00%

Publisher:

Abstract:

Background: There is increasing interest in how culture may affect the quality of healthcare services, and previous research has shown that ‘treatment culture’ (of which there are three categories: resident centred, ambiguous and traditional) in a nursing home may influence the prescribing of psychoactive medications. Objective: The objective of this study was to explore and understand treatment culture in the prescribing of psychoactive medications for older people with dementia in nursing homes. Method: Six nursing homes, two from each treatment culture category, participated in this study. Qualitative data were collected through semi-structured interviews with nursing home staff and general practitioners (GPs), which sought to determine participants’ views on the prescribing and administration of psychoactive medication, and their understanding of treatment culture and its potential influence on the prescribing of psychoactive drugs. Following verbatim transcription, the data were analysed and themes were identified, facilitated by NVivo and discussion within the research team. Results: Interviews took place with five managers, seven nurses, 13 care assistants and two GPs. Four themes emerged: the characteristics of the setting, the characteristics of the individual, relationships, and decision making. The characteristics of the setting were exemplified by views of the setting, daily routines and staff training. The characteristics of the individual were demonstrated by views on the personhood of residents and staff attitudes. Relationships varied between staff within and outside the home, and these relationships appeared to influence decision making about the prescribing of medications. The data analysis found that each home exhibited traits indicative of its assigned treatment culture. Conclusion: Nursing home treatment culture appeared to be influenced by four main themes.
Modification of these factors may lead to a shift in culture towards a more flexible, resident-centred culture and a reduction in prescribing and use of psychoactive medication. 

Relevance:

90.00%

Publisher:

Abstract:

Tide gauge data are identified as legacy data, given the radical transition in observation method and required output format associated with tide gauges over the 20th century. Observed water-level variation from tide-gauge records is regarded as the only significant basis for determining recent historical variation (decade to century) in mean sea level and storm surge. Few tide gauge records cover the 20th century, so the Belfast (UK) Harbour tide gauge would constitute a strategic long-term (110-year) record if the full paper-based records (marigrams) were digitally restructured to allow consistent data analysis. This paper presents the methodology for extracting a consistent time series of observed water levels from the five different Belfast Harbour tide-gauge positions/machine types, starting in late 1901. Tide-gauge data were digitally retrieved from the original analogue (daily) records by scanning the marigrams and then extracting the sequential tidal elevations with graph-line-seeking software (Ungraph™). This automation of signal extraction allowed the full Belfast series to be retrieved quickly relative to any manual x-y digitisation of the signal. Restructuring the variable-length tidal data sets into a consistent daily, monthly and annual file format was undertaken with project-developed software: Merge&Convert and MergeHYD allow consistent water-level sampling both at 60 min (the past standard) and at 10 min intervals, the latter enhancing surge measurement. The Belfast tide-gauge data have been rectified, validated and quality-controlled (IOC 2006 standards). The result is a consistent annual-based legacy data series for Belfast Harbour that includes over 2 million tidal-level observations.
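The resampling step of such a restructuring can be sketched with a hypothetical helper (not Merge&Convert or MergeHYD themselves): reduce digitised 10-min water levels to the past 60-min standard by keeping only on-the-hour readings. The series below is synthetic:

```python
from datetime import datetime, timedelta

# Hypothetical resampling helper: keep only on-the-hour readings from a
# digitised 10-min water-level series. Data below are synthetic.
def resample_hourly(series):
    """series: list of (timestamp, level_m) tuples at 10-min spacing."""
    return [(t, lvl) for t, lvl in series if t.minute == 0]

start = datetime(1901, 11, 1, 0, 0)
ten_min = [(start + timedelta(minutes=10 * i), 2.0 + 0.1 * i) for i in range(13)]
hourly = resample_hourly(ten_min)
print(len(ten_min), len(hourly))
```

Keeping both sampling intervals in the archive, as the project software does, preserves the 60-min series for consistency with the historical standard while retaining the 10-min series for surge analysis.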

Relevance:

90.00%

Publisher:

Abstract:

Syria has been a major producer and exporter of fresh fruit and vegetables (FFV) in the Arab region. Prior to 2011, Syrian FFV were mainly exported to the neighbouring countries, the Gulf States and Northern Africa, as well as to Eastern European countries. Although the EU is potentially one of the most profitable markets for high-quality FFV (such as organic produce) in the world, Syrian exports of FFV to Western European countries like Germany have been small. It could be a lucrative opportunity for Syrian growers and exporters of FFV to export organic products to markets such as Germany, where national production is limited to a few months of the year due to climatic conditions. Yet the organic sector in Syria is comparatively young, and only a very small area of FFV is certified according to EU organic regulations. To the author's knowledge, little was known about Syrian farmers' attitudes towards organic FFV production, and no study had so far explored and analysed the factors determining the adoption of organic FFV among Syrian farmers or the export of these products to EU markets. The overarching aim of the present dissertation was to explore and identify the market potential of Syrian exports of organic FFV to Germany. The dissertation was therefore concerned with three main objectives: (i) to explore whether German importers and wholesalers of organic FFV see market opportunities for Syrian organic products and what requirements they have in terms of quality and quantity, (ii) to determine the obstacles Syrian producers and exporters face when exporting agricultural products to Germany, and (iii) to investigate whether Syrian FFV farmers can imagine converting their farms to organic production, along with the underlying reasons why they would or would not do so. A twofold methodological approach comprising expert interviews and a farmer survey was used to address these objectives.
While the expert interviews were conducted with German and Syrian wholesalers of (organic) FFV in 2011 (nine interviews each), the farmer survey was administered to 266 Syrian FFV farmers in the main production region (the coastal region) from November 2012 until May 2013. For modelling farmers' decisions to adopt organic farming, the Theory of Planned Behaviour was used as the theoretical framework and Partial Least Squares Structural Equation Modelling as the main method of data analysis. The findings of this dissertation yield implications for the different stakeholders (governmental institutions and NGOs, farmers, exporters, wholesalers, etc.) interested in promoting Syrian exports of organic products. Based on the empirical results and a literature review, an action plan to promote Syrian production and export of organic products was developed, which can help improve the organic sector in post-war Syria.
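Before fitting a PLS-SEM model, reflective constructs such as the Theory of Planned Behaviour's attitude items are typically checked for internal consistency. A minimal Cronbach's alpha on synthetic Likert responses (the items and scores are illustrative, not the survey's data):

```python
# Minimal Cronbach's alpha check for a reflective construct (synthetic
# Likert responses for three hypothetical "attitude" items, 8 respondents).
def cronbach_alpha(items):
    """items: one list of scores per questionnaire item."""
    k = len(items)
    def var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)
    totals = [sum(resp) for resp in zip(*items)]
    return k / (k - 1) * (1 - sum(var(i) for i in items) / var(totals))

attitude_items = [
    [5, 4, 4, 2, 5, 3, 4, 2],   # item 1 across 8 respondents
    [5, 4, 5, 2, 4, 3, 4, 1],   # item 2
    [4, 4, 4, 1, 5, 3, 5, 2],   # item 3
]
alpha = cronbach_alpha(attitude_items)
print(round(alpha, 2))
```

Values above roughly 0.7 are conventionally taken as acceptable reliability for a construct's measurement model before the structural (path) model is estimated.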

Relevance:

90.00%

Publisher:

Abstract:

Twitter is among the biggest social networks in the world; every day millions of tweets are posted and discussed, expressing various views and opinions. A wide variety of research has studied how these opinions can be clustered and analyzed so that tendencies can be uncovered. Owing to the inherent weaknesses of tweets (very short texts in a very informal style of writing), it is hard to analyze tweet data with good performance and accuracy. In this paper, we approach the problem from another angle, using a two-layer structure to analyze the Twitter data: LDA combined with topic-map modelling. The experimental results demonstrate that this approach represents progress in Twitter data analysis. However, more experiments with this method are needed to ensure that accurate analytic results can be maintained.
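The LDA layer of such a two-layer analysis can be sketched with a tiny collapsed Gibbs sampler on toy "tweets" (the corpus and hyperparameters are illustrative; the paper's system adds a topic-map layer on top of the LDA output):

```python
import random

# Tiny collapsed-Gibbs LDA on toy "tweets" (an illustrative sketch only).
random.seed(42)
docs = [["game", "team", "win"], ["team", "score", "game"],
        ["vote", "party", "win"], ["party", "election", "vote"]]
K, ALPHA, BETA, ITERS = 2, 0.1, 0.01, 200
vocab = sorted({w for d in docs for w in d})
V = len(vocab)

z = [[random.randrange(K) for _ in d] for d in docs]  # topic of each token
ndk = [[0] * K for _ in docs]                         # doc-topic counts
nkw = [[0] * V for _ in range(K)]                     # topic-word counts
nk = [0] * K                                          # topic totals
for d, doc in enumerate(docs):
    for i, w in enumerate(doc):
        k = z[d][i]
        ndk[d][k] += 1; nkw[k][vocab.index(w)] += 1; nk[k] += 1

for _ in range(ITERS):
    for d, doc in enumerate(docs):
        for i, w in enumerate(doc):
            k, v = z[d][i], vocab.index(w)
            ndk[d][k] -= 1; nkw[k][v] -= 1; nk[k] -= 1
            weights = [(ndk[d][j] + ALPHA) * (nkw[j][v] + BETA)
                       / (nk[j] + V * BETA) for j in range(K)]
            k = random.choices(range(K), weights=weights)[0]
            z[d][i] = k
            ndk[d][k] += 1; nkw[k][v] += 1; nk[k] += 1

dominant = [max(range(K), key=lambda j: ndk[d][j]) for d in range(len(docs))]
print(dominant)  # dominant topic per toy tweet
```

On this toy corpus the sampler tends to separate the "sports" tweets from the "politics" tweets; a second layer would then organise the induced topics and their associations into a topic map.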

Relevance:

90.00%

Publisher:

Abstract:

Thesis (Ph.D.)--University of Washington, 2016-08

Relevance:

90.00%

Publisher:

Abstract:

This paper reviews objective assessments of Parkinson's disease (PD) motor symptoms, both cardinal symptoms and dyskinesia, using sensor systems. It surveys the manifestation of PD symptoms, the sensors used for their detection, the types of signals (measures) and the signal-processing (data-analysis) methods applied to them. The review's findings are summarized in a table listing the devices (sensors), measures and methods used in each reviewed motor-symptom assessment study. Among the sensors in the gathered studies, accelerometers and touch-screen devices are the most widely used to detect PD symptoms, and among the symptoms, bradykinesia and tremor were the most frequently evaluated. In general, machine learning methods are promising for this purpose. PD is a complex disease that requires continuous monitoring and multidimensional symptom analysis. Combining existing technologies to develop new sensor platforms may help assess the overall symptom profile more accurately and lead to useful tools supporting a better treatment process.
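For tremor, a common signal-processing route in such studies is spectral: estimate the dominant frequency of an accelerometer window and flag readings in the typical 4-6 Hz parkinsonian rest-tremor band. The signal and band limits below are synthetic and illustrative:

```python
import math

# Tremor screening sketch: estimate the dominant frequency of a (synthetic)
# accelerometer window with a plain DFT and flag the 4-6 Hz tremor band.
FS = 50                                                  # sampling rate, Hz
t = [i / FS for i in range(FS * 4)]                      # 4 s window
signal = [math.sin(2 * math.pi * 5.0 * ti) for ti in t]  # 5 Hz "tremor"

def dominant_freq(x, fs):
    """Return the frequency of the strongest non-DC DFT bin (O(n^2), toy-sized)."""
    n = len(x)
    best_k, best_p = 1, 0.0
    for k in range(1, n // 2):               # skip the DC bin
        re = sum(xi * math.cos(2 * math.pi * k * i / n) for i, xi in enumerate(x))
        im = sum(xi * math.sin(2 * math.pi * k * i / n) for i, xi in enumerate(x))
        p = re * re + im * im
        if p > best_p:
            best_k, best_p = k, p
    return best_k * fs / n

f = dominant_freq(signal, FS)
print(f, 4.0 <= f <= 6.0)   # dominant frequency and tremor-band flag
```

Real systems would use an FFT over sliding windows and feed such band-power features, alongside others, into the machine learning classifiers the review discusses.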