857 results for Mapping And Monitoring
Abstract:
Previous research suggests that many eating behaviours are stable in children but that obesogenic eating behaviours tend to increase with age. This research explores the stability (consistency in individual levels over time) and continuity (consistency in group levels over time) of child eating behaviours and parental feeding practices in children between 2 and 5 years of age. Thirty-one participants completed measures of child eating behaviours, parental feeding practices and child weight at 2 and 5 years of age. Child eating behaviours and parental feeding practices remained stable between 2 and 5 years of age. There was also good continuity in measures of parental restriction and monitoring of food intake, as well as in mean levels of children's eating behaviours and BMI over time. Mean levels of maternal pressure to eat significantly increased, whilst mean levels of desire to drink significantly decreased, between 2 and 5 years of age. These findings suggest that children's eating behaviours are stable and continuous in the period prior to 5 years of age. Further research is necessary to replicate these findings and to explore why later developmental increases are seen in children's obesogenic eating behaviours. © 2011 Elsevier Ltd.
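The stability/continuity distinction drawn in this abstract lends itself to a short numerical sketch. This is a hedged illustration with invented scores, not the study's data or analysis: stability is indexed by a test-retest correlation of individual scores, continuity by the change in the group mean.

```python
# Illustrative sketch only: scores below are invented, not study data.
import numpy as np

def stability(scores_t1, scores_t2):
    """Test-retest correlation: a high r means individual differences are stable."""
    return float(np.corrcoef(scores_t1, scores_t2)[0, 1])

def mean_level_change(scores_t1, scores_t2):
    """Change in the group mean: a small change suggests continuity."""
    return float(np.mean(scores_t2) - np.mean(scores_t1))

# e.g. a hypothetical eating-behaviour score for the same children at ages 2 and 5
age2 = np.array([2.1, 3.4, 1.8, 4.0, 2.9, 3.1])
age5 = np.array([2.3, 3.6, 1.7, 4.2, 3.0, 3.2])
print(stability(age2, age5))          # close to 1: stable rank order
print(mean_level_change(age2, age5))  # small: continuous mean level
```

A measure can be stable (children keep their rank order) while mean levels shift, which is exactly the pattern the abstract reports for pressure to eat.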
Abstract:
Question/Issue: We combine agency and institutional theory to explain the division of equity shares between the foreign (majority) and local (minority) partners within foreign affiliates. We posit that once the decision to invest is made, the ownership structure is arranged so as to generate appropriate incentives for local partners, taking into account both the institutional environment and the firm-specific difficulty in monitoring. Research Findings/Insights: Using a large firm-level dataset for the period 2003-2011 from 16 Central and Eastern European countries and applying selectivity-corrected estimates, we find that both weaker host country institutions and a higher share of intangible assets in total assets imply a higher minority equity share for local partners. The findings hold when controlling for host country effects and when the attributes of the institutional environment are instrumented. Theoretical/Academic Implications: The classic view is that weak institutions lead to concentrated ownership, yet it leaves the level of minority equity shares unexplained. Our contribution uses a firm-level perspective combined with national-level variation in the institutional environment, and applies agency theory to explain the minority local partner share in foreign affiliates. In particular, we posit that the information asymmetry and monitoring problems in firms are exacerbated by weak host country institutions, but also by a higher share of intangible assets in total assets. Practitioner/Policy Implications: When assessing investment opportunities abroad, foreign firms need to pay attention not only to features directly related to corporate governance (e.g., bankruptcy codes) but also to the broad institutional environment. In weak institutional environments, foreign parent firms need to create strong incentives for local partners by offering them significant minority shares in equity. The same recommendation applies to firms with higher shares of intangible assets in total assets. © 2014 The Authors.
Abstract:
Energy dissipation and fatigue properties of nano-layered thin films are less well studied than their bulk counterparts. Existing experimental methods for studying energy dissipation, which typically use magnetic interaction as a driving force at different frequencies together with a laser-based deformation measurement system, are difficult to apply to two-dimensional materials. We propose a novel experimental method for dynamic testing of thin-film materials: a cantilever specimen is driven at its fixed end by a bimorph piezoelectric actuator, and the displacements of the specimen and the actuator are monitored with a fibre-optic system. Upon vibration, the specimen is strongly affected by its inertia and behaves as a cantilever beam under translational base excitation. At resonance, this method resembles the vibrating reed method conventionally used in the viscoelasticity community. The loss tangent is obtained both from the width of a resonance peak and from a free-decay process. For fatigue measurement, we implement a control algorithm in LabVIEW to maintain the maximum displacement of the specimen over the course of the experiment, from which fatigue S-N curves are obtained.
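The two loss-tangent estimates mentioned in this abstract follow standard vibrating-reed relations, sketched below. This is an illustration using the textbook half-power-bandwidth and logarithmic-decrement formulas, with invented numbers; it is not code from the paper.

```python
# Standard vibrating-reed relations for the loss tangent (illustrative values).
import numpy as np

def loss_tangent_bandwidth(f_resonance, f_half_power_low, f_half_power_high):
    """tan(delta) from the half-power (-3 dB) width of a resonance peak."""
    return (f_half_power_high - f_half_power_low) / f_resonance

def loss_tangent_free_decay(peak_amplitudes):
    """tan(delta) from the mean logarithmic decrement of successive free-decay peaks."""
    a = np.asarray(peak_amplitudes, dtype=float)
    log_decrement = np.mean(np.log(a[:-1] / a[1:]))
    return float(log_decrement / np.pi)

# e.g. a 100 Hz resonance with a 1 Hz half-power width
print(loss_tangent_bandwidth(100.0, 99.5, 100.5))    # 0.01
# successive decay peaks shrinking by ~3.1% per cycle give a similar value
print(loss_tangent_free_decay([1.0, 0.969, 0.939]))  # ~0.01
```

For a lightly damped specimen, the two estimates should agree, which is a useful consistency check on the measurement.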
Abstract:
This thesis examines the ways Indonesian politicians exploit the rhetorical power of metaphors in Indonesian political discourse. The research applies Conceptual Metaphor Theory, Metaphorical Frame Analysis and Critical Discourse Analysis to textual and oral data. The corpus comprises: 150 political news articles from two newspapers (Harian Kompas and Harian Waspada, 2010-2011 editions), 30 recordings of two television news and talk-show programmes (TV-One and Metro-TV), and 20 interviews with four legislators, two educated persons and two laymen. For this study, a corpus of written bahasa Indonesia was also compiled, comprising 150 texts of approximately 439,472 tokens. The data analysis shows the potential power of metaphors in relation to how politicians communicate the results of their thinking, reasoning and meaning-making through language and discourse, and its social consequences. The analysis firstly revealed 1155 metaphors. These metaphors were then classified into the categories of conventional metaphor, cognitive function of metaphor, metaphorical mapping and metaphor variation. The degree of conventionality of metaphors was established based on the number of expressions in each group of metaphors. Secondly, the analysis revealed that metaphor variation is influenced by the broader Indonesian cultural context and the natural and physical environment, across social, regional, stylistic and individual dimensions. The mapping system of these metaphors is unidirectional. Thirdly, the data show that metaphoric thought pervades political discourse in relation to its uses as: (1) a felicitous tool for the rhetoric of political leaders, (2) part of meaning-making that keeps discourse contexts alive and active, and (3) a force that, together with discourse, shapes the conceptual structures of politicians' rhetoric.
Fourthly, the analysis revealed that Indonesian political discourse attempts to create both distance and solidarity towards general and specific social categories, accomplished via metaphorical and frame references to the conceptualisations of us/them. The analysis shows that metaphor and frame are excellent indicators of the us/them categories, which work dialectically in the discourse. The acts of categorisation via metaphors and frames at both the textual and conceptual levels activate asymmetrical concepts and contribute to social and political hierarchical constructs, e.g. WEAKNESS vs. POWER, STUDENT vs. TEACHER, GHOST vs. CHOSEN WARRIOR, and so on. This analysis underscores the dynamic nature of categories by documenting metaphorical transfers between categories such as ENEMY, DISEASE, BUSINESS and MYSTERIOUS OBJECT, and CORRUPTION, LAW, POLITICS and CASE. The metaphorical transfers showed that politicians try to dictate how they categorise each other in order to mobilise audiences to act on behalf of their ideologies and to create distance and solidarity.
Abstract:
Aim: To examine the use of image analysis to quantify changes in ocular physiology. Method: A purpose designed computer program was written to objectively quantify bulbar hyperaemia, tarsal redness, corneal staining and tarsal staining. Thresholding, colour extraction and edge detection paradigms were investigated. The repeatability (stability) of each technique to changes in image luminance was assessed. A clinical pictorial grading scale was analysed to examine the repeatability and validity of the chosen image analysis technique. Results: Edge detection using a 3 × 3 kernel was found to be the most stable to changes in image luminance (2.6% over a +60 to -90% luminance range) and correlated well with the CCLRU scale images of bulbar hyperaemia (r = 0.96), corneal staining (r = 0.85) and the staining of palpebral roughness (r = 0.96). Extraction of the red colour plane demonstrated the best correlation-sensitivity combination for palpebral hyperaemia (r = 0.96). Repeatability variability was <0.5%. Conclusions: Digital imaging, in conjunction with computerised image analysis, allows objective, clinically valid and repeatable quantification of ocular features. It offers the possibility of improved diagnosis and monitoring of changes in ocular physiology in clinical practice. © 2003 British Contact Lens Association. Published by Elsevier Science Ltd. All rights reserved.
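A minimal sketch of the kind of processing this abstract describes: a 3 × 3 edge-detection kernel applied to a grey-level image, and extraction of the red colour plane. The specific kernel used in the study is not stated; a Sobel operator and all variable names are assumptions for illustration.

```python
# Hedged sketch: 3x3 edge detection (Sobel assumed) and red-plane extraction.
import numpy as np

SOBEL_X = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)

def filter2d(image, kernel):
    """Valid-mode 3x3 cross-correlation (no padding)."""
    h, w = image.shape
    out = np.zeros((h - 2, w - 2))
    for i in range(h - 2):
        for j in range(w - 2):
            out[i, j] = np.sum(image[i:i + 3, j:j + 3] * kernel)
    return out

def edge_strength(gray):
    """Mean gradient magnitude: a simple objective 'amount of edges' metric."""
    gx = filter2d(gray, SOBEL_X)
    gy = filter2d(gray, SOBEL_X.T)
    return float(np.hypot(gx, gy).mean())

def red_plane(rgb):
    """Extract the red colour plane from an H x W x 3 image array."""
    return rgb[..., 0]

# a horizontal intensity ramp has a uniform Sobel response of 8
gray = np.tile(np.arange(5.0), (5, 1))
print(edge_strength(gray))  # 8.0
```

Because a gradient-based edge metric depends on intensity differences rather than absolute levels, it is comparatively insensitive to overall luminance changes, consistent with the stability the study reports.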
Abstract:
Bladder cancer is among the most common cancers worldwide (4th in men). It is responsible for high patient morbidity and displays rapid recurrence and progression. The lack of sensitivity of gold standard techniques (white light cystoscopy, voided urine cytology) means many early treatable cases are missed. The result is a large number of advanced cases of bladder cancer which require extensive treatment and monitoring. For this reason, bladder cancer is the single most expensive cancer to treat on a per patient basis. In recent years, autofluorescence spectroscopy has begun to shed new light on disease research. Of particular interest in cancer research are the fluorescent metabolic cofactors NADH and FAD. Early in tumour development, cancer cells often undergo a metabolic shift (the Warburg effect) resulting in increased NADH. The ratio of NADH to FAD (the "redox ratio") can therefore be used as an indicator of the metabolic status of cells. Redox ratio measurements have been used to differentiate between healthy and cancerous breast cells and to monitor cellular responses to therapies. Here, we demonstrate, using healthy and bladder cancer cell lines, a statistically significant difference in the redox ratio of bladder cancer cells, indicative of a metabolic shift. To do this, we customised a standard flow cytometer to excite and record fluorescence specifically from NADH and FAD, and developed a method for automatically calculating the redox ratio of individual cells within large populations. These results could inform the design of novel probes and screening systems for the early detection of bladder cancer.
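The per-cell redox ratio calculation can be sketched as follows. This is a hedged illustration with invented channel intensities; the cited work's actual pipeline and its exact ratio definition may differ.

```python
# Hedged sketch: per-cell NADH/FAD ratio over a population (invented values).
import numpy as np

def redox_ratios(nadh_channel, fad_channel):
    """Per-cell NADH/FAD ratio from two fluorescence intensity channels."""
    nadh = np.asarray(nadh_channel, dtype=float)
    fad = np.asarray(fad_channel, dtype=float)
    return nadh / fad

def mean_shift(healthy_ratios, cancer_ratios):
    """Difference in mean redox ratio; a positive value is consistent with
    the NADH increase expected under the Warburg effect."""
    return float(np.mean(cancer_ratios) - np.mean(healthy_ratios))

healthy = redox_ratios([100, 110, 90], [200, 220, 180])   # each cell: 0.5
cancer = redox_ratios([150, 165, 135], [200, 220, 180])   # each cell: 0.75
print(mean_shift(healthy, cancer))  # 0.25
```

In practice the per-cell distributions overlap, so a statistical test over large populations, as the abstract describes, is needed rather than a simple mean comparison.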
Abstract:
This paper deals with communicational breakdowns and misunderstandings in computer-mediated communication (CMC) and ways to recover from or prevent them. The paper describes a case study of CMC conducted in a company named Artigiani. We observed communication and conducted content analysis of e-mail messages, focusing on exchanges between customer service representatives (CSRs) and their contacts. In addition to task management difficulties, we identified communication breakdowns that result from differences between perspectives and from a lack of contextual information, mainly the technical background and professional jargon on the customers' side. We examined possible ways to enhance CMC and accordingly designed a prototype for an e-mail user interface that emphasizes a communicational strategy called contextualization as a central component for obtaining effective communication and for supporting effective management and control of organizational activities, especially handling orders, price quoting, and monitoring the supply and installation of products.
Abstract:
ACM Computing Classification System (1998): J.3.
Abstract:
Due to dynamic variability, identifying the specific conditions under which non-functional requirements (NFRs) are satisfied may only be possible at runtime. It is therefore necessary to consider the dynamic treatment of relevant information during requirements specification. The associated data can be gathered by monitoring the execution of the application and its underlying environment, supporting reasoning about how the current application configuration fulfils the established requirements. This paper presents a dynamic decision-making infrastructure that supports both NFR representation and monitoring, and reasons about the degree of satisfaction of NFRs at runtime. The infrastructure is composed of: (i) an extended feature model aligned with a domain-specific language for representing NFRs to be monitored at runtime; (ii) a monitoring infrastructure to continuously assess NFRs at runtime; and (iii) a flexible decision-making process to select the best available configuration based on the satisfaction degree of the NFRs. The evaluation of the approach has shown that it is able to choose application configurations that fit user NFRs well based on runtime information. The evaluation also revealed that the proposed infrastructure provides consistent indicators of the best application configurations for user NFRs. Finally, a benefit of our approach is that it allows us to quantify the level of satisfaction with respect to the NFR specification.
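The configuration-selection step of such a decision-making process might look like the sketch below. The weighted-average scoring scheme, the configuration names and the NFR names are all assumptions for illustration, not the paper's actual process.

```python
# Hedged sketch: pick the configuration with the highest NFR satisfaction degree.

def satisfaction(config_degrees, nfr_weights):
    """Weighted average of per-NFR satisfaction degrees, each in [0, 1]."""
    total_weight = sum(nfr_weights.values())
    return sum(nfr_weights[n] * config_degrees[n] for n in nfr_weights) / total_weight

def best_configuration(configs, nfr_weights):
    """Select the available configuration with the highest overall satisfaction."""
    return max(configs, key=lambda name: satisfaction(configs[name], nfr_weights))

# hypothetical runtime-measured satisfaction degrees per configuration
configs = {
    "low_power": {"response_time": 0.6, "energy": 0.9},
    "high_perf": {"response_time": 0.95, "energy": 0.4},
}
weights = {"response_time": 0.7, "energy": 0.3}  # user priorities
print(best_configuration(configs, weights))  # high_perf
```

As the weights shift (e.g. towards energy on a low battery), the selected configuration changes, which is the essence of runtime NFR-driven reconfiguration.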
Abstract:
For the treatment and monitoring of Parkinson's disease (PD) to be scientific, a key requirement is that measurement of disease stages and severity is quantitative, reliable, and repeatable. The last 50 years in PD research have been dominated by qualitative, subjective ratings obtained by human interpretation of the presentation of disease signs and symptoms at clinical visits. More recently, “wearable,” sensor-based, quantitative, objective, and easy-to-use systems for quantifying PD signs for large numbers of participants over extended durations have been developed. This technology has the potential to significantly improve both clinical diagnosis and management in PD and the conduct of clinical studies. However, the large-scale, high-dimensional character of the data captured by these wearable sensors requires sophisticated signal processing and machine-learning algorithms to transform it into scientifically and clinically meaningful information. Such algorithms that “learn” from data have shown remarkable success in making accurate predictions for complex problems in which human skill has been required to date, but they are challenging to evaluate and apply without a basic understanding of the underlying logic on which they are based. This article contains a nontechnical tutorial review of relevant machine-learning algorithms, also describing their limitations and how these can be overcome. It discusses implications of this technology and a practical road map for realizing the full potential of this technology in PD research and practice. © 2016 International Parkinson and Movement Disorder Society.
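As a toy example of the "learning from data" idea this review discusses, the sketch below fits a nearest-centroid classifier to synthetic feature vectors of the sort a wearable sensor pipeline might produce (e.g. tremor amplitude, gait variability). The features, labels and choice of algorithm are all illustrative assumptions, not the article's content.

```python
# Hedged sketch: nearest-centroid classification of synthetic sensor features.
import numpy as np

def fit_centroids(features, labels):
    """Learn one mean feature vector (centroid) per class label."""
    return {c: features[labels == c].mean(axis=0) for c in np.unique(labels)}

def predict(centroids, x):
    """Assign x to the class whose centroid is nearest (Euclidean distance)."""
    return min(centroids, key=lambda c: float(np.linalg.norm(x - centroids[c])))

# synthetic two-feature data: label 0 = control-like, label 1 = PD-like
X = np.array([[0.1, 0.2], [0.2, 0.1], [0.9, 1.0], [1.0, 0.9]])
y = np.array([0, 0, 1, 1])
centroids = fit_centroids(X, y)
print(predict(centroids, np.array([0.95, 0.95])))  # 1
```

Even this trivial learner illustrates the evaluation pitfalls the article warns about: accuracy must be measured on data held out from fitting, or the reported performance is optimistic.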
Abstract:
To chronicle demographic movement across African-Asian corridors, a variety of molecular (sequence analysis, restriction mapping, denaturing high-performance liquid chromatography, etc.) and statistical (correspondence analysis, AMOVA, calculation of diversity indices, phylogenetic inference, etc.) techniques were employed to assess the phylogeographic patterns of mtDNA control region and Y-chromosomal variation among 14 sub-Saharan, North African and Middle Eastern populations. The patterns of genetic diversity revealed evidence of multiple migrations across several African-Asian passageways as well as within the African continent itself. The two-part analysis uncovered several interesting results, including the following: (1) a north (Egypt and the Middle East) to south (sub-Saharan Africa) partitioning of both mtDNA and Y-chromosomal haplogroup diversity, (2) a genetic diversity gradient in sub-Saharan Africa from east to west, (3) evidence in favor of the Levantine Corridor over the Horn of Africa as the major genetic conduit since the Last Glacial Maximum, (4) a substantially higher mtDNA versus Y-chromosomal sub-Saharan component in the Middle East collections, (5) a higher representation of East versus West African mtDNA haplotypes in the Arabian Peninsula populations, with no such bias in the Levant groups, and lastly, (6) genetic remnants of the Bantu demographic expansion in sub-Saharan Africa.
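One of the diversity indices this abstract mentions can be made concrete. The sketch below computes Nei's haplotype (gene) diversity, H = n/(n-1) * (1 - sum(p_i^2)), from haplotype counts in a sample; the haplotype labels are invented, and this is a standard population-genetics formula rather than the study's code.

```python
# Hedged sketch: Nei's haplotype (gene) diversity from a sample of haplotypes.
from collections import Counter

def haplotype_diversity(haplotypes):
    """H = n/(n-1) * (1 - sum of squared haplotype frequencies)."""
    n = len(haplotypes)
    counts = Counter(haplotypes)
    sum_p_squared = sum((count / n) ** 2 for count in counts.values())
    return n / (n - 1) * (1 - sum_p_squared)

# e.g. a sample of four individuals carrying three distinct haplotypes
print(haplotype_diversity(["H1", "H1", "H2", "H3"]))  # 0.8333...
```

Comparing such indices across populations along a transect is one way a diversity gradient like the reported east-to-west decline can be quantified.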
Abstract:
Deception research has traditionally focused on three methods of identifying liars and truth tellers: observing non-verbal or behavioral cues, analyzing verbal cues, and monitoring changes in physiological arousal during polygraph tests. Research shows that observers are often incapable of discriminating between liars and truth tellers with better than chance accuracy when they use these methods. One possible explanation for observers' poor performance is that they are not properly applying existing lie detection methods. An alternative explanation is that the cues on which these methods — and observers' judgments — are based do not reliably discriminate between liars and truth tellers. It may be possible to identify more reliable cues, and potentially improve observers' ability to discriminate, by developing a better understanding of how liars and truth tellers try to tell a convincing story. This research examined (a) the verbal strategies used by truthful and deceptive individuals during interviews concerning an assigned activity, and (b) observers' ability to discriminate between them based on their verbal strategies. In Experiment I, pre-interview instructions manipulated participants' expectations regarding verifiability; each participant was led to believe that the interviewer could check some types of details, but not others, before deciding whether the participant was being truthful or deceptive. Interviews were then transcribed and scored for quantity and type of information provided. In Experiment II, observers listened to a random sample of the Experiment I interviews and rendered veracity judgments; half of the observers were instructed to judge the interviews according to the verbal strategies used by liars and truth tellers and the other half were uninstructed. Results of Experiment I indicate that liars and truth tellers use different verbal strategies, characterized by a differential amount of detail. Overall, truthful participants provided more information than deceptive participants. This effect was moderated by participants' expectations regarding verifiability such that truthful participants provided more information only with regard to verifiable details. Results of Experiment II indicate that observers instructed about liars' and truth tellers' verbal strategies identify them with greater accuracy than uninstructed observers.
Abstract:
Over the past 200 years, an estimated 53% (about 47 million ha) of the original wetlands in the conterminous United States have been lost, mainly as a result of various human activities. Despite the importance of wetlands (particularly along the coast), and a longstanding federal policy framework meant to protect their integrity, the cumulative impact on these natural systems over large areas is poorly understood. We address this lack of research by mapping and conducting descriptive spatial analyses of federal wetland alteration permits (pursuant to section 404 of the Clean Water Act) across 85 watersheds in Florida and coastal Texas from 1991 to 2003. Results show that more than half of the permits issued in both states (60%) fell under the Nationwide permitting category. Permits issued in Texas were typically located outside of urban areas (78%) and outside 100-year floodplains (61%). More than half of permits issued in Florida were within urban areas (57%) and outside of 100-year floodplains (51%). The most affected wetlands types were estuarine in Texas (47%) and palustrine in Florida (55%). We expect that an additional outcome of this work will be an increased awareness of the cumulative depletion of wetlands and loss of ecological services in these urbanized areas, perhaps leading to increased conservation efforts.
Abstract:
An automated online SPE-LC-MS/MS method was developed for the quantitation of multiple classes of antibiotics in environmental waters. High sensitivity in the low ng/L range was accomplished by using large-volume injections of 10 mL of sample. Positive confirmation of analytes was achieved using two selected reaction monitoring (SRM) transitions per antibiotic, and quantitation was performed using an internal standard approach. Samples were extracted using online solid-phase extraction; using a column-switching technique, the extracts were passed directly to liquid chromatography and analyzed by tandem mass spectrometry. The total run time per sample was 20 min. The statistically calculated method detection limits for various environmental samples were between 1.2 and 63 ng/L. Furthermore, the method was validated in terms of precision, accuracy and linearity. The developed analytical methodology was used to measure the occurrence of antibiotics in reclaimed waters (n=56), surface waters (n=53), ground waters (n=8) and drinking waters (n=54) collected from different parts of South Florida. In reclaimed waters, the most frequently detected antibiotics were nalidixic acid, erythromycin, clarithromycin, azithromycin, trimethoprim, sulfamethoxazole and ofloxacin (19.3-604.9 ng/L). The detection of antibiotics in reclaimed waters indicates that they cannot be completely removed by conventional wastewater treatment processes. Furthermore, the average mass load of antibiotics released into the local environment through reclaimed water was estimated at 0.248 kg/day. Among the surface water samples, the Miami River (reaching up to 580 ng/L) and Black Creek canal (up to 124 ng/L) showed the highest concentrations of antibiotics. No traces of antibiotics were found in ground waters. On the other hand, erythromycin (monitored as anhydroerythromycin) was detected in 82% of the drinking water samples (n.d.-66 ng/L).
The developed approach is suitable for both research and monitoring applications. Major metabolites of antibiotics in reclaimed waters were identified and quantified using a high-resolution benchtop Q-Exactive Orbitrap mass spectrometer. A phase I metabolite of erythromycin was tentatively identified in full scan based on accurate mass measurement. Using extracted ion chromatograms (XIC), high-resolution data-dependent MS/MS spectra and metabolic profiling software, the metabolite was identified as desmethyl anhydroerythromycin, with molecular formula C36H63NO12 and m/z 702.4423. The molar ratio of the metabolite to erythromycin was on the order of 13%. To my knowledge, this is the first known report of this metabolite in reclaimed water. Another compound, acetyl-sulfamethoxazole, a phase II metabolite of sulfamethoxazole, was also identified in reclaimed water; the mole fraction of the metabolite represents 36% of the cumulative sulfamethoxazole concentration. These results illustrate the importance of including metabolites in routine analysis to obtain a mass balance for a better understanding of the occurrence, fate and distribution of antibiotics in the environment. Finally, all the antibiotics detected in reclaimed and surface waters were investigated to assess the potential risk to aquatic organisms. The surface water antibiotic concentrations, which represented real exposure conditions, revealed that the macrolide antibiotics erythromycin, clarithromycin and tylosin, along with the quinolone antibiotic ciprofloxacin, were suspected to induce high toxicity in aquatic biota. Preliminary results show that, among the antibiotic groups tested, macrolides posed the highest ecological threat, and therefore they may need to be further evaluated with long-term exposure studies considering bioaccumulation factors and a larger number of selected species. Overall, the occurrence of antibiotics in the aquatic environment poses an ecological health concern.
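A mass-load estimate of the kind quoted in this abstract (kg/day) is a simple unit conversion from a measured concentration and a discharge flow. The sketch below illustrates the arithmetic with an invented flow value, not the study's figures.

```python
# Hedged sketch: daily antibiotic mass load from concentration and flow.

def mass_load_kg_per_day(concentration_ng_per_L, flow_L_per_day):
    """kg/day released = (ng/L) * (L/day), converted from ng to kg (1 ng = 1e-12 kg)."""
    return concentration_ng_per_L * flow_L_per_day * 1e-12

# e.g. 500 ng/L in a hypothetical 100 million L/day reclaimed-water discharge
print(mass_load_kg_per_day(500, 1e8))  # 0.05 kg/day
```

Summing such loads over the detected compounds is how an aggregate figure like the reported 0.248 kg/day can be assembled.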
Abstract:
Established as a National Park in 1980, Biscayne National Park (BISC) comprises an area of nearly 700 km2, most of which is under water. The terrestrial portions of BISC include a coastal strip on the south Florida mainland and a set of Key Largo limestone barrier islands which parallel the mainland several kilometers offshore and define the eastern rim of Biscayne Bay. The upland vegetation component of BISC is embedded within an extensive coastal wetland network, including an archipelago of 42 mangrove-dominated islands with extensive areas of tropical hardwood forests, or hammocks. Several databases and vegetation maps describe these terrestrial communities. However, these sources are, for the most part, outdated, incomplete, incompatible, and/or inaccurate. For example, the current vegetation map of BISC (Welch et al., 1999) is nearly 10 years old and represents the conditions of Biscayne National Park shortly after Hurricane Andrew (August 24, 1992). As a result, a new terrestrial vegetation map was commissioned by the National Park Service Inventory and Monitoring Program, South Florida/Caribbean Network.