922 results for Movement Data Analysis
Abstract:
Housing Partnerships (HPs) are collaborative arrangements that assist communities in the delivery of affordable housing by combining the strengths of the public and private sectors. They emerged in several states, counties, and cities in the eighties as innovative solutions to the challenges in affordable housing resulting from the changing dynamics of delivery and production. My study examines HPs with particular emphasis on the identification of those factors associated with the successful performance of their mission of affordable housing. I will use the Balanced Scorecard (BSC) framework in this study. The identification of performance factors facilitates a better understanding of how HPs can be successful in achieving their mission, and it is significant in the context of the current economic environment because HPs can be viewed as innovative institutional mechanisms in the provision of affordable housing. The present study uses a mixed methods research approach, drawing on data from IRS Form 990 tax returns, a survey of the chief executives of HPs, and other secondary sources. The data analysis is framed according to the four perspectives of the BSC: financial, customer, internal business, and learning and growth. Financially, revenue diversification affects the financial health of HPs and their overall performance. Although HPs depend on private and government funding, they also depend on service fees to carry out their mission. From a customer perspective, HPs mainly serve low- and moderate-income households, although some serve specific groups such as seniors, the homeless, veterans, and victims of domestic violence. From an internal business perspective, HPs' programs are oriented toward affordable housing needs, undertaking not only traditional activities such as construction and loan provision, but also advocacy and educational programs.
From a learning and growth perspective, HPs are small in staff size but undertake a range of activities with the help of volunteers. Every part of the HP is developed to maximize resources, knowledge, and skills in order to assist communities in the delivery of affordable housing and related needs. Overall, housing partnerships have played a key role in affordable housing despite the housing market downturn since 2006: their expenditure on affordable housing activities increased despite the decrease in their revenues.
Abstract:
Understanding habitat selection and movement remains a key question in behavioral ecology. Yet, obtaining a sufficiently high spatiotemporal resolution of the movement paths of organisms remains a major challenge, despite recent technological advances. Observing fine-scale movement and habitat choice decisions in the field can prove to be difficult and expensive, particularly in expansive habitats such as wetlands. We describe the application of passive integrated transponder (PIT) systems to field enclosures for tracking detailed fish behaviors in an experimental setting. PIT systems have been applied to habitats with clear passageways, at fixed locations or in controlled laboratory and mesocosm settings, but their use in unconfined habitats and field-based experimental setups remains limited. In an Everglades enclosure, we continuously tracked the movement and habitat use of PIT-tagged centrarchids across three habitats of varying depth and complexity using multiple flatbed antennas for 14 days. Fish used all three habitats, with marked species-specific diel movement patterns across habitats, and short-lived movements that would likely be missed by other tracking techniques. Findings suggest that the application of PIT systems to field enclosures can be an insightful approach for gaining continuous, undisturbed and detailed movement data in unconfined habitats, and for experimentally manipulating both internal and external drivers of these behaviors.
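Continuous PIT detections of this kind lend themselves to simple habitat-use summaries. The sketch below is illustrative only: it assumes a hypothetical record format of (tag_id, habitat, timestamp_seconds) tuples from the flatbed antennas (not the study's actual data pipeline) and credits the interval between consecutive detections of a fish to the habitat of the earlier detection.

```python
from collections import defaultdict

def habitat_use(detections):
    """Summarise seconds spent per habitat for each PIT-tagged fish.

    detections: iterable of (tag_id, habitat, timestamp_seconds) tuples.
    Time between consecutive detections of the same tag is credited to
    the habitat of the earlier detection (a simplifying assumption).
    """
    by_tag = defaultdict(list)
    for tag, habitat, t in detections:
        by_tag[tag].append((t, habitat))

    use = defaultdict(lambda: defaultdict(float))  # tag -> habitat -> seconds
    for tag, events in by_tag.items():
        events.sort()
        for (t0, h0), (t1, _) in zip(events, events[1:]):
            use[tag][h0] += t1 - t0
    return {tag: dict(h) for tag, h in use.items()}

# One fish detected twice in a slough, then once on a ridge, over 30 min.
dets = [("A1", "slough", 0), ("A1", "slough", 600), ("A1", "ridge", 1800)]
print(habitat_use(dets))  # {'A1': {'slough': 1800.0}}
```

The last detection of each fish contributes no interval, which is why the ridge accrues no time in this toy example.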
Abstract:
Advanced Placement is a series of courses and tests designed to determine mastery over introductory college material. It has become part of the American educational system. The changing conception of AP was examined using critical theory to determine what led to a view of continual success. The study utilized David Armstrong's variation of Michel Foucault's critical theory to construct an analytical framework. Black and Ubbes' data gathering techniques and Braun and Clark's data analysis were utilized as the analytical framework. Data included 1135 documents: 641 journal articles, 421 newspaper articles and 82 government documents. The study revealed three historical ruptures correlated to three themes containing subthemes. The first rupture was the Sputnik launch in 1958. Its correlated theme was AP leading to school reform with subthemes of AP as reform for able students and AP's gaining of acceptance from secondary schools and higher education. The second rupture was the Nation at Risk report published in 1983. Its correlated theme was AP's shift in emphasis from the exam to the course with the subthemes of AP as a course, a shift in AP's target population, using AP courses to promote equity, and AP courses modifying curricula. The passage of the No Child Left Behind Act of 2001 was the third rupture. Its correlated theme was AP as a means to narrow the achievement gap with the subthemes of AP as a college preparatory program and the shifting of AP to an open access program. The themes revealed a perception that progressively integrated the program into American education. The AP program changed emphasis from tests to curriculum, and is seen as the nation's premier academic program to promote reform and prepare students for college. It has become a major source of income for the College Board. In effect, AP has become an agent of privatization, spurring other private entities into competition for government funding.
The change and growth of the program over the past 57 years resulted in a deep integration into American education. As such, the program remains an intrinsic part of the system and continues to evolve within American education.
Abstract:
Women are a high-risk population for cardiovascular disease (CVD); however, research on the relationships between CVD and subpopulations of mothers is sparse. A secondary data analysis of the 2006 Health Survey of Adults and Children in Bermuda was conducted to compare the prevalence of CVD risk factors in single (n=77) and partnered (n=241) mothers. A higher percentage of single mothers were Black, had a BMI >25 kg/m2 (p=0.01), and reported high blood pressure (p=0.004) and high cholesterol (p=0.017). Single mothers were nearly three times (OR=2.66) more likely to experience high blood pressure and more than twice (OR=2.22) as likely to have high cholesterol. Single mothers may benefit from nutrition education programs related to lowering CVD risk.
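For readers unfamiliar with the statistic, odds ratios like the reported OR=2.66 come from a 2x2 contingency table: the odds of the outcome among the exposed group divided by the odds among the unexposed. The cell counts below are hypothetical, chosen only to reproduce an OR near 2.66; the abstract does not report the actual counts.

```python
def odds_ratio(exposed_cases, exposed_noncases, unexposed_cases, unexposed_noncases):
    """Odds ratio from a 2x2 table: odds of the outcome among the
    exposed divided by the odds among the unexposed."""
    return (exposed_cases / exposed_noncases) / (unexposed_cases / unexposed_noncases)

# Hypothetical counts (NOT from the Bermuda survey), e.g. 40 of 77 single
# mothers vs 69 of 239 partnered mothers with high blood pressure:
print(round(odds_ratio(40, 37, 69, 170), 2))  # 2.66
```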
Abstract:
Textual composition in the classroom has been an object of research in language studies over the last three decades in Brazil. This thematic recurrence demonstrates the gap between the teaching of writing and learners' performance. In this research, we argue that, during the writing process in the classroom, teachers' mediated actions guide students toward an exotopic exercise on their own texts, treating it as a fundamental phase of composition, with meaningful effects on the development of textual authorship. We therefore chose as our focus of investigation the textual composition of Letters students at the Universidade do Estado do Rio Grande do Norte (UERN), studying the processual characteristics of writing under teacher mediation. The main aim of this research is to analyze students' (re)writing throughout the Letters course in order to understand the construction of authorship in their texts and the effects of teacher mediation on this process. More specifically, we aim to: a) analyze teacher mediation as a mechanism for the development of authorship in texts composed by Letters students; b) infer, from different versions of a textual composition, the effects of teacher mediation on students' writing; and c) describe the process of textual composition in the classroom, identifying students' attitudes and behaviors toward the writing task. We bring several voices into the dialogue, highlighting those grounded in Bakhtinian studies: the Bakhtin circle itself (BAKHTIN/VOLOCHINOV, [1929] 2006; BAKHTIN, [1979] 2003; [1963] 2008; [1975] 2010a; [1986] 2010b) and its commentators (FARACO, 2008, 2009a, 2009b, 2010; PONZIO, 2010, 2012; GERALDI, 2010a, 2010b; OLIVEIRA, 2006, 2008a, 2008b, 2010, among others), who guide us mainly on dialogism, author and authorship, and their conceptual implications: exotopy, finishing, aesthetic activity, and the ethical act.
The data were constituted in a teaching situation involving the teacher/researcher and 5th-term Letters students at UERN, using an open questionnaire, textual discussion, and the (re)writing of an article. Data analysis revealed the subjects' limited experience with textual composition in the course as a systematic, routine, dialogued practice whose social function is explored. Texts are generally written in a single version and serve only to receive a grade. The analysis shows students who are insecure about writing and face many difficulties with it. On the other hand, the writing movements in the analyzed articles reveal that the subjects took a responsive attitude toward the mediated activities, responding to the rewriting proposal. Although some problems remained unsolved and many others emerged in each version of the article, we consider that, in general, teacher mediation had a positive effect on student writing, as it fostered the author's exotopic movement, which is indispensable for composing a text. The three interventions carried out provided, in some way, opportunities for the subjects to modify their articles.
Abstract:
The objective of this study was to characterize the structural-geophysical expression of the Transbrasiliano Lineament (TBL) in the east-central portion of the Parnaíba Basin. The TBL corresponds to a major Neoproterozoic NE-trending shear zone related to the Brasiliano orogenic cycle, with dextral strike-slip kinematics, underlying the sedimentary section of the Parnaíba Basin (but also laterally exposed at the NE and SW basin edges). In this study, the interpretation of gravity and magnetic anomaly maps is consistent with the TBL kinematics, the signature of the geophysical anomalies corresponding to the high-temperature (plastic behaviour) and subsequent declining-temperature (ductile to brittle behaviour) stages during Brasiliano and late Brasiliano times. The pattern of residual gravity anomalies is compatible with a dextral S-C pair shaping the geological bodies of a heterogeneous basement, such as slices of gneisses and granulites (positive anomalies) and granitic and low- to medium-grade metasedimentary rocks (negative anomalies). The curvilinear trends of these anomalies, ranging from NNE to NE, correspond to flattening surfaces (S), while the NE rectilinear trend must represent a C band. The narrower magnetic anomalies also display NNE to NE (S surface) trends and should correspond to similar (although narrower and more discontinuous) sources in the equivalent anomaly patterns. Pre-Silurian pull-apart-style grabens may contribute to the NE negative gravimetric anomalies, although this interpretation requires control by seismic data analysis. On the other hand, the curvilinear anomalies associated with contractional trends are incompatible with their interpretation as pre-Silurian grabens in both maps. In the reduced-to-the-pole magnetic anomaly map, most of these anomalies are again associated with low-temperature shear zones (C planes) and faults, juxtaposing blocks with distinct magnetic properties, or possibly filled with basic igneous bodies.
It is also possible that some isolated magnetic anomalies correspond to igneous bodies of late Brasiliano or Mesozoic age. The pattern of late basement discontinuities can be interpreted by analogy with the Riedel fracture model, with steeply dipping surfaces and a sub-horizontal movement section. This study also explored 2D gravity modeling controlled by the interpretation of a dip seismic line across the Transbrasiliano Lineament. The rock section equivalent to the Jaibaras Group, occupying a graben structure identified in the seismic line, corresponds to a discrete negative anomaly superimposed on a gravimetric high, once again indicating a stronger influence of older crystalline basement rocks as gravimetric sources, mainly reflecting the heterogeneities and anisotropies generated at high-temperature conditions and during their subsequent cooling along the TBL during the Brasiliano cycle.
Abstract:
Maternity nursing practice is changing across Canada with the movement toward becoming “baby friendly.” The World Health Organization (WHO) recommends the Baby-Friendly Hospital Initiative (BFHI) as a standard of care in hospitals worldwide. Very little research has been conducted with nurses to explore the impact of the initiative on nursing practice. The purpose of this study, therefore, was to examine the process of implementing the BFHI for nurses. The study was carried out using Corbin and Strauss’s method of grounded theory. Theoretical sampling was employed, which resulted in recruiting and interviewing 13 registered nurses whose area of employment included neonatal intensive care, postpartum, and labour and delivery. The data analysis revealed a central category of resisting the BFHI. All of the nurses disagreed with some of the 10 steps to becoming a baby-friendly hospital as outlined by the WHO. Participants questioned the science and safety of aspects of the BFHI. Also, participants indicated that the implementation of this program did not substantially change their nursing practice. They empathized with new mothers and anticipated being collectively reprimanded by management should they not follow the initiative. Five conditions influenced their responses to the initiative, which were (a) an awareness of a pro-breastfeeding culture, (b) imposition of the BFHI, (c) knowledge of the health benefits of breastfeeding, (d) experiential knowledge of infant feeding, and (e) the belief in the autonomy of mothers to decide about infant feeding. The identified outcomes were moral distress and division between nurses. The study findings could guide decision making concerning the implementation of the BFHI.
Abstract:
Thermodynamic stability measurements on proteins and protein-ligand complexes can offer insights not only into the fundamental properties of protein folding reactions and protein functions, but also into the development of protein-directed therapeutic agents to combat disease. Conventional calorimetric or spectroscopic approaches for measuring protein stability typically require large amounts of purified protein. This requirement has precluded their use in proteomic applications. Stability of Proteins from Rates of Oxidation (SPROX) is a recently developed mass spectrometry-based approach for proteome-wide thermodynamic stability analysis. Since the proteomic coverage of SPROX is fundamentally limited by the detection of methionine-containing peptides, the use of tryptophan-containing peptides was investigated in this dissertation. A new SPROX-like protocol was developed that measured protein folding free energies using the denaturant dependence of the rate at which globally protected tryptophan and methionine residues are modified with dimethyl (2-hydroxy-5-nitrobenzyl) sulfonium bromide and hydrogen peroxide, respectively. This so-called Hybrid protocol was applied to proteins in yeast and MCF-7 cell lysates and achieved a ~50% increase in proteomic coverage compared to probing only methionine-containing peptides. Subsequently, the Hybrid protocol was successfully utilized to identify and quantify both known and novel protein-ligand interactions in cell lysates. The ligands under study included the well-known Hsp90 inhibitor geldanamycin and the less well-understood omeprazole sulfide that inhibits liver-stage malaria. In addition to protein-small molecule interactions, protein-protein interactions involving Puf6 were investigated using the SPROX technique in comparative thermodynamic analyses performed on wild-type and Puf6-deletion yeast strains.
A total of 39 proteins were detected as Puf6 targets and 36 of these targets were previously unknown to interact with Puf6. Finally, to facilitate the SPROX/Hybrid data analysis process and minimize human errors, a Bayesian algorithm was developed for transition midpoint assignment. In summary, the work in this dissertation expanded the scope of SPROX and evaluated the use of SPROX/Hybrid protocols for characterizing protein-ligand interactions in complex biological mixtures.
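The transition-midpoint assignment can be illustrated with a much simpler stand-in than the Bayesian algorithm described above: a least-squares grid search over candidate midpoints of a two-state sigmoid fitted to denaturant-dependence data. This is only a sketch of the idea (fixed slope, synthetic data), not the dissertation's method.

```python
import math

def fit_midpoint(denaturant, signal, midpoints, slope=2.0):
    """Pick the candidate midpoint m minimising squared error against a
    two-state sigmoid f(x) = 1 / (1 + exp(slope * (x - m))).
    The slope is fixed here purely for illustration."""
    def sse(m):
        return sum((y - 1.0 / (1.0 + math.exp(slope * (x - m)))) ** 2
                   for x, y in zip(denaturant, signal))
    return min(midpoints, key=sse)

# Synthetic denaturant-dependence data with a true midpoint of 1.5 M.
xs = [0.0, 0.5, 1.0, 1.5, 2.0, 2.5, 3.0]
ys = [1.0 / (1.0 + math.exp(2.0 * (x - 1.5))) for x in xs]
print(fit_midpoint(xs, ys, [0.5, 1.0, 1.5, 2.0, 2.5]))  # 1.5
```

A Bayesian treatment would instead place a prior over midpoints and weight candidates by likelihood rather than picking the single least-squares winner.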
Abstract:
Energy efficiency and user comfort have recently become priorities in the Facility Management (FM) sector. This has resulted in the use of innovative building components, such as thermal solar panels and heat pumps, as they have the potential to provide better performance, energy savings and increased user comfort. However, as the complexity of components increases, so does the requirement for maintenance management. The standard routine for building maintenance is inspection, which results in repairs or replacement when a fault is found. This routine leads to unnecessary inspections, which have a cost with respect to component downtime and work hours. This research proposes an alternative routine: performing building maintenance at the point in time when a component is degrading and requires maintenance, thus reducing the frequency of unnecessary inspections. This thesis demonstrates that statistical techniques can be used as part of a maintenance management methodology to invoke maintenance before failure occurs. The proposed FM process is presented through a scenario utilising current Building Information Modelling (BIM) technology and innovative contractual and organisational models. This FM scenario supports a Degradation-based Maintenance (DbM) scheduling methodology, implemented using two statistical techniques: Particle Filters (PFs) and Gaussian Processes (GPs). DbM consists of extracting and tracking a degradation metric for a component. Limits for the degradation metric are identified based on one of a number of proposed processes, which determine the limits based on the maturity of the historical information available. DbM is implemented for three case study components: a heat exchanger, a heat pump, and a set of bearings. The identified degradation points for each case study, from a PF, a GP and a hybrid (PF and GP combined) DbM implementation, are assessed against known degradation points.
The GP implementations are successful for all components. For the PF implementations, the results presented in this thesis find that the extracted metrics and limits identify degradation occurrences accurately for components in continuous operation; for components with seasonal operational periods, the PF may wrongly identify degradation. The GP performs more robustly than the PF, but the PF, on average, results in fewer false positives. The hybrid implementations, which combine GP and PF results, are successful for two of the three case studies and are not affected by seasonal data. Overall, DbM is effectively applied for the three case study components. The accuracy of the implementations is dependent on the relationships modelled by the PF and GP, and on the type and quantity of data available. This novel maintenance process can improve equipment performance and reduce energy wastage from the operation of building components.
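As an illustration of the PF side of DbM, the following minimal bootstrap particle filter tracks a noisy, upward-drifting degradation metric and flags maintenance when the filtered estimate crosses a limit. It is a generic sketch under assumed dynamics (constant drift, Gaussian noise), not the thesis implementation.

```python
import math
import random

def particle_filter(observations, n=500, drift=0.1, proc_sd=0.05, obs_sd=0.2, seed=1):
    """Bootstrap particle filter for a degradation metric assumed to
    drift upward by `drift` per step with Gaussian process noise.
    Weights come from a Gaussian likelihood of each observation;
    resampling is multinomial. Returns the filtered estimates."""
    rng = random.Random(seed)
    particles = [0.0] * n
    estimates = []
    for z in observations:
        # Propagate each particle with drift plus process noise.
        particles = [p + drift + rng.gauss(0, proc_sd) for p in particles]
        # Weight particles by how well they explain the observation.
        weights = [math.exp(-((z - p) ** 2) / (2 * obs_sd ** 2)) for p in particles]
        total = sum(weights) or 1.0
        weights = [w / total for w in weights]
        estimates.append(sum(p * w for p, w in zip(particles, weights)))
        # Multinomial resampling to avoid weight degeneracy.
        particles = rng.choices(particles, weights=weights, k=n)
    return estimates

# A simple ramp standing in for a degrading component's metric.
obs = [0.1 * t + 0.05 for t in range(1, 21)]
est = particle_filter(obs)
limit = 1.5  # degradation limit; crossing it triggers maintenance
first_alarm = next(i for i, e in enumerate(est) if e > limit)
```

In a DbM setting the limit itself would come from the historical-maturity processes described above rather than being fixed by hand.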
Abstract:
The necessity of elemental analysis techniques to solve forensic problems continues to expand as the samples collected from crime scenes grow in complexity. Laser ablation ICP-MS (LA-ICP-MS) has been shown to provide a high degree of discrimination between samples that originate from different sources. In the first part of this research, two laser ablation ICP-MS systems were compared, one using a nanosecond laser and the other a femtosecond laser source, for the forensic analysis of glass. The results showed that femtosecond LA-ICP-MS did not provide significant improvements in terms of accuracy, precision and discrimination; however, it did provide lower detection limits. In addition, it was determined that, even for femtosecond LA-ICP-MS, an internal standard should be utilized to obtain accurate analytical results for glass analyses. In the second part, a method using laser-induced breakdown spectroscopy (LIBS) for the forensic analysis of glass was shown to provide excellent discrimination for a glass set consisting of 41 automotive fragments. The discrimination power was compared to that of two of the leading elemental analysis techniques, µXRF and LA-ICP-MS, and the results were similar: all methods generated >99% discrimination, and the pairs found indistinguishable were similar. An extensive data analysis approach for LIBS glass analyses was developed to minimize Type I and II errors, en route to a recommendation of 10 ratios to be used for glass comparisons. Finally, a LA-ICP-MS method for the qualitative analysis and discrimination of gel ink sources was developed and tested on a set of ink samples. In the first discrimination study, qualitative analysis was used to obtain 95.6% discrimination in a blind study consisting of 45 black gel ink samples provided by the United States Secret Service. A 0.4% false exclusion (Type I) error rate and a 3.9% false inclusion (Type II) error rate were obtained for this discrimination study.
In the second discrimination study, 99% discrimination power was achieved for a black gel ink pen set consisting of 24 self-collected samples. The two pairs found to be indistinguishable came from the same source of origin (the same manufacturer and type of pen purchased in different locations). It was also found that gel ink from the same pen, regardless of age, was indistinguishable, as were gel ink pens (four pens) originating from the same pack.
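Discrimination power in studies like these is simply the percentage of all sample pairs a method can tell apart. The short sketch below uses toy data mirroring the structure of the 24-pen study (two indistinguishable pairs), not the actual measurements.

```python
from itertools import combinations

def discrimination_power(samples, distinguishable):
    """Percentage of all sample pairs that a method can tell apart.

    samples: list of sample IDs; distinguishable: function of two IDs
    returning True if the method distinguishes them."""
    pairs = list(combinations(samples, 2))
    hits = sum(1 for a, b in pairs if distinguishable(a, b))
    return 100.0 * hits / len(pairs)

# 24 hypothetical pens with exactly two indistinguishable pairs.
indist = {frozenset({"p1", "p2"}), frozenset({"p3", "p4"})}
pens = [f"p{i}" for i in range(1, 25)]
dp = discrimination_power(pens, lambda a, b: frozenset({a, b}) not in indist)
print(round(dp, 1))  # 276 pairs, 274 distinguishable -> 99.3
```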
Abstract:
Background There is increasing interest in how culture may affect the quality of healthcare services, and previous research has shown that ‘treatment culture’—of which there are three categories (resident centred, ambiguous and traditional)—in a nursing home may influence prescribing of psychoactive medications. Objective The objective of this study was to explore and understand treatment culture in prescribing of psychoactive medications for older people with dementia in nursing homes. Method Six nursing homes—two from each treatment culture category—participated in this study. Qualitative data were collected through semi-structured interviews with nursing home staff and general practitioners (GPs), which sought to determine participants’ views on prescribing and administration of psychoactive medication, and their understanding of treatment culture and its potential influence on prescribing of psychoactive drugs. Following verbatim transcription, the data were analysed and themes were identified, facilitated by NVivo and discussion within the research team. Results Interviews took place with five managers, seven nurses, 13 care assistants and two GPs. Four themes emerged: the characteristics of the setting, the characteristics of the individual, relationships and decision making. The characteristics of the setting were exemplified by views of the setting, daily routines and staff training. The characteristics of the individual were demonstrated by views on the personhood of residents and staff attitudes. Relationships varied between staff within and outside the home. These relationships appeared to influence decision making about prescribing of medications. The data analysis found that each home exhibited traits that were indicative of its respective assigned treatment culture. Conclusion Nursing home treatment culture appeared to be influenced by four main themes. 
Modification of these factors may lead to a shift in culture towards a more flexible, resident-centred culture and a reduction in prescribing and use of psychoactive medication.
Abstract:
Tide gauge data are identified as legacy data given the radical transition in observation method and required output format associated with tide gauges over the 20th century. Observed water level variation through tide-gauge records is regarded as the only significant basis for determining recent historical variation (decade to century) in mean sea level and storm surge. Few tide gauge records cover the full 20th century, so the Belfast (UK) Harbour tide gauge would be a strategic long-term (110-year) record if the full paper-based records (marigrams) were digitally restructured to allow consistent data analysis. This paper presents the methodology of extracting a consistent time series of observed water levels from the 5 different Belfast Harbour tide gauge positions/machine types, starting in late 1901. Tide-gauge data were digitally retrieved from the original analogue (daily) records by scanning the marigrams and then extracting the sequential tidal elevations with graph-line-seeking software (Ungraph™). This automation of signal extraction allowed the full Belfast series to be retrieved quickly, relative to any manual x–y digitisation of the signal. Restructuring variable-length tidal data sets to a consistent daily, monthly and annual file format was undertaken by project-developed software: Merge&Convert and MergeHYD allow consistent water level sampling both at 60 min (the past standard) and 10 min intervals, the latter enhancing surge measurement. Belfast tide-gauge data have been rectified, validated and quality controlled (IOC 2006 standards). The result is a consistent annual-based legacy data series for Belfast Harbour that includes over 2 million tidal-level observations.
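The restructuring step, merging variable-length records onto consistent 60 min or 10 min grids, can be sketched as linear interpolation of irregular readings onto a regular time grid. This is an illustrative simplification, not the Merge&Convert/MergeHYD implementation.

```python
def resample_levels(records, interval_s):
    """Resample irregular (timestamp_s, level_m) observations onto a
    regular grid by linear interpolation between bracketing readings."""
    records = sorted(records)
    out = []
    t = records[0][0]
    i = 0
    while t <= records[-1][0]:
        # Advance to the reading pair that brackets grid time t.
        while records[i + 1][0] < t:
            i += 1
        (t0, v0), (t1, v1) = records[i], records[i + 1]
        frac = (t - t0) / (t1 - t0) if t1 != t0 else 0.0
        out.append((t, v0 + frac * (v1 - v0)))
        t += interval_s
    return out

# Hypothetical 15-min readings resampled onto a 10-min grid.
recs = [(0, 1.00), (900, 2.00), (1800, 1.00)]
print(resample_levels(recs, 600))
```

Real marigram traces would also need gap handling and datum corrections before interpolation, which this sketch omits.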