923 results for secondary structure analysis


Relevance: 80.00%

Publisher:

Abstract:

Freshwater is extremely precious, and even more precious than freshwater is clean freshwater. Although two thirds of our planet is covered in water, industrial activities over the last century have contaminated our globe with chemicals on an unprecedented scale, causing harm to humans and wildlife. We must adopt a new scientific mindset to face this problem and protect this vital resource. The Water Framework Directive (WFD; European Parliament and the Council, 2000) is a milestone piece of legislation that transformed the way water quality monitoring is undertaken across all Member States by introducing the Ecological and Chemical Status. A “good or higher” Ecological Status was expected to be achieved for all waterbodies in Europe by 2015. Yet for most European waterbodies, which are determined to be at risk or of moderate to bad quality, further information is required before adequate remediation strategies can be implemented. To date, water quality evaluation is based on five biological components (phytoplankton, macrophytes and benthic algae, macroinvertebrates and fishes) and various hydromorphological and physicochemical elements. The evaluation of Chemical Status is based principally on 33 priority substances and 12 xenobiotics considered dangerous for the environment. This approach takes into account only a fraction of the numerous xenobiotics that can be present in surface waters and cannot reveal all the possible causes of ecotoxicological stress acting on a water section. Mixtures of toxic chemicals may constitute an ecological risk that cannot be predicted from the concentrations of the single components. To improve water quality, sources of contamination and causes of ecological alteration need to be identified.
On the other hand, the analysis of community structure, which is the result of multiple processes including hydrological constraints and physico-chemical stress, gives back only a “photograph” of the actual status of a site without revealing the causes and sources of the perturbation. A multidisciplinary approach able to integrate the information obtained by different methods, such as community structure analysis and eco-genotoxicological studies, could help overcome some of the difficulties in properly identifying the different causes of stress in risk assessment. In synthesis, river ecological status is the result of a combination of multiple pressures that, for management purposes and quality improvement, have to be disentangled from each other. To reduce the current uncertainty in risk assessment, methods that establish quantitative links between levels of contamination and community alterations are needed. The analysis of macrobenthic invertebrate community structure has been widely used to identify sites subject to perturbation. Trait-based descriptors of community structure constitute a useful method in ecological risk assessment. The diagnostic capacity of freshwater biomonitoring could be improved by chronic sublethal toxicity testing of water and sediment samples. Because they require an exposure time that covers most of the species’ life cycle, chronic toxicity tests are able to reveal negative effects on life-history traits at contaminant concentrations well below the acute toxicity level. Furthermore, the responses of high-level endpoints (growth, fecundity, mortality) can be integrated to evaluate the impact on population dynamics, a highly relevant endpoint from the ecological point of view. To gain more accurate information about the potential causes and consequences of environmental contamination, the evaluation of adverse effects at the physiological, biochemical and genetic levels is also needed.
The use of different biomarkers and toxicity tests can give information about the sub-lethal and toxic load of environmental compartments. Biomarkers give essential information about exposure to toxicants, such as endocrine disruptor compounds and genotoxic substances, whose negative effects cannot be revealed using only high-level toxicological endpoints. The increasing presence of genotoxic pollutants in the environment has caused concern about the potential harmful effects of xenobiotics on human health, and interest in the development of new and more sensitive methods for the assessment of mutagenic and carcinogenic risk. Within the WFD, biomarkers and bioassays are regarded as important tools for gaining lines of evidence on cause-effect relationships in ecological quality assessment. Although the scientific community clearly recognizes the advantages and necessity of an ecotoxicological approach within ecological quality assessment, a recent review reports that, more than a decade after the publication of the WFD, only a few studies have attempted to integrate ecological water status assessment and biological methods (namely biomarkers or bioassays). None of the fifteen reviewed studies included both biomarkers and bioassays. The integrated approach developed in this PhD thesis comprises a set of laboratory bioassays (Daphnia magna acute and chronic toxicity tests, Comet Assay and FPG-Comet) that were newly developed, modified taking a cue from existing standardized protocols, or applied for freshwater quality testing (ecotoxicological, genotoxicological and toxicogenomic assays), coupled with field investigations of macrobenthic community structure (SPEAR and EBI indices). Together with the development of new bioassays with Daphnia magna, the feasibility of eco-genotoxicological testing of freshwater and sediment quality with Heterocypris incongruens was evaluated (Comet Assay and a protocol for chronic toxicity).
However, the Comet Assay, although standardized, was not applied to freshwater samples because of the lack of sensitivity of this species observed after 24 h of exposure to relatively high (and not environmentally relevant) concentrations of reference genotoxicants. Furthermore, this species also proved unsuitable for chronic toxicity testing because of the difficulty of evaluating fecundity as a sub-lethal endpoint of exposure and of complications due to its biology and behaviour. The study was applied to a pilot hydrographic sub-basin by selecting sections subjected to different levels of anthropogenic pressure: this allowed us to establish reference conditions, to select the most significant endpoints, to evaluate the coherence of the responses of the different lines of evidence (alteration of community structure, eco-genotoxicological responses, alteration of gene expression profiles) and, finally, to assess the diagnostic capacity of the monitoring strategy. Significant correlations were found between the genotoxicological parameter Tail Intensity % (TI%) and the macrobenthic community descriptors SPEAR (p<0.001) and EBI (p<0.05), between the genotoxicological parameter describing DNA oxidative stress (ΔTI%) and mean nitrate levels (p<0.01), and between reproductive impairment (Failed Development % from D. magna chronic bioassays) and TI% (p<0.001) as well as EBI (p<0.001). While the correlation among parameters demonstrates a general coherence in the response to increasing impacts, the concomitant ability of each single endpoint to respond to specific sources of stress is the basis of the diagnostic capacity of the integrated approach, as demonstrated by stations presenting a mismatch among the different lines of evidence.
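The kind of correlation analysis reported here can be sketched with standard tools. This is an illustrative example only: the station values below are invented, and Pearson's r stands in for whatever correlation statistic the thesis actually applied.

```python
from scipy.stats import pearsonr

# Synthetic station-level values (illustrative placeholders, not thesis data):
ti_percent = [5.2, 8.1, 12.4, 15.0, 18.3, 22.7, 25.1, 30.6]    # Comet Assay Tail Intensity %
spear      = [0.72, 0.65, 0.58, 0.50, 0.44, 0.36, 0.30, 0.21]  # SPEAR index per station

# Correlate the genotoxicity endpoint with the community descriptor.
r, p_value = pearsonr(ti_percent, spear)
print(f"r = {r:.3f}, p = {p_value:.2g}")
```

A strongly negative r with a small p would mirror the reported TI% vs SPEAR association: the higher the genotoxic load, the lower the share of pesticide-sensitive taxa.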
The chosen set of bioassays, and the selected endpoints, do not provide redundant indications of water quality status but, on the contrary, contribute complementary pieces of information about the several stressors that act simultaneously on a waterbody section, giving this monitoring strategy a solid diagnostic capacity. Our approach should provide opportunities for the integration of biological effects into monitoring programmes for surface water, especially in investigative monitoring. Moreover, it should provide a more realistic assessment of the impact on, and exposure of, aquatic organisms to contaminants. Finally, this approach should provide an evaluation of the drivers of change in biodiversity and of their effects on ecosystem function and service provision, that is, the direct and indirect contributions to human well-being.

Abstract:

This PhD thesis belongs to three main knowledge domains: operations management, environmental management, and decision making. With the automotive industry as the key sector, the investigation was undertaken to deepen the understanding of environmental decision-making processes in the operations function. The central research question of this thesis is: “Why and how do manufacturing companies take environmental decisions?” This PhD research project used a case study research strategy supplemented by secondary data analysis and by the testing and evaluation of a proposed systems-thinking model for environmental decision making. Interviews and focus groups were the main methods of data collection. The findings of the thesis show that companies aspiring to environmental leadership will need to take environmental decisions that go beyond manufacturing processes. Because the benefits (including financial gains) of non-manufacturing activities are not yet clear, decisions related to product design, the supply chain and facilities are deeply embedded in complexity, subjectivity, and intrinsic risk. Nevertheless, this is the challenge environmental leaders will face: they may enter a paradoxical state in which, although the risk of going greener is high, the risk of not doing so is even higher.

Abstract:

A multivariate descriptive model of environmental and nature conservation attitudes and values is proposed and empirically supported. A mapping sentence is developed out of the analysis of data from a series of Repertory Grid interviews addressing conservation employees' attitudes towards their profession's activities. The research is carried out within the meta-theoretical framework of Facet Theory. A mapping sentence consisting of 9 facets is developed. From the mapping sentence, 3 questionnaires were constructed to examine selective orientations towards environmental concern. A mapping sentence and facet model are developed for each study. Once the internal structure of this model had been established using Similarity Structure Analysis, the elements of the facets were subjected to Partial Order Scalogram Analysis with base coordinates. A questionnaire was statistically analysed to assess the relationship between facet elements and 4 measures of attitudes towards, and involvement with, conservation. This enabled comparison of the relative strengths of the attitudes associated with each facet element and each measure of conservation attitude. In general, the relationships between the social value of conservation and involvement pledges to conservation were monotonic, with the perceived importance of a conservation issue appearing predictive of personal involvement. Furthermore, the elements of the life-area and scale facets were differentially related to the attitude measures. The multivariate descriptive model of environmental conservation values and attitudes is discussed in relation to its implications for psychological research into environmental concern and for environmental and nature conservation.

Abstract:

A series of N1-benzylideneheteroarylcarboxamidrazones was prepared in an automated fashion and tested against Mycobacterium fortuitum in a rapid screen for antimycobacterial activity. Many of the compounds from this series were also tested against Mycobacterium tuberculosis, and the usefulness of M. fortuitum as a rapid initial screen for anti-tubercular activity was evaluated. Various deletions were made to the N1-benzylideneheteroarylcarboxamidrazone structure in order to establish the minimum structural requirements for activity. The N1-benzylideneheteroarylcarboxamidrazones were then subjected to molecular modelling studies, and their activities against M. fortuitum and M. tuberculosis were analysed using quantitative structure-activity relationship (QSAR) techniques in the computational package TSAR (Oxford Molecular Ltd.). A set of equations predictive of antimycobacterial activity was thereby obtained. The series of N1-benzylideneheteroarylcarboxamidrazones was also tested against a multidrug-resistant strain of Staphylococcus aureus (MRSA) and, where activity against MRSA was observed, against a panel of Gram-positive and Gram-negative bacteria. A set of antimycobacterial N1-benzylideneheteroarylcarboxamidrazones was thereby discovered, the best of which had MICs against M. fortuitum in the range 4-8 μg ml-1 and displayed 94% inhibition of M. tuberculosis at a concentration of 6.25 μg ml-1. The antimycobacterial activity of these compounds appeared to be specific, since the same compounds were shown to be inactive against other classes of organisms. Compounds found to be sufficiently active in any screen were also tested for toxicity against human mononuclear leucocytes. Polyethylene glycol (PEG) was used as a soluble polymeric support for the synthesis of some fatty acid derivatives containing an isoxazoline group, which may inhibit mycolic acid synthesis in mycobacteria.
Both the PEG-bound products and the cleaved, isolated products were tested against M. fortuitum, and some low levels of antimycobacterial activity were observed; these compounds may serve as leads for further studies.
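The QSAR step can be illustrated in outline. Everything below is hypothetical: the descriptors, the activity values, and the use of ordinary least squares are stand-ins for the descriptors and regression actually performed in TSAR.

```python
import numpy as np

# Hypothetical descriptor matrix for six compounds: [logP, molar refractivity].
X = np.array([
    [2.1, 30.5],
    [2.8, 33.1],
    [3.4, 35.0],
    [1.9, 29.2],
    [3.0, 34.2],
    [2.5, 31.8],
])
# Hypothetical activities, e.g. -log(MIC) against M. fortuitum.
y = np.array([1.10, 1.45, 1.78, 0.98, 1.55, 1.30])

# Fit a linear QSAR equation  activity = b0 + b1*logP + b2*MR  by least squares.
A = np.column_stack([np.ones(len(y)), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

# Goodness of fit (r^2) on the training data.
pred = A @ coef
r2 = 1 - np.sum((y - pred) ** 2) / np.sum((y - y.mean()) ** 2)
```

In a real QSAR study the equations would of course be validated on held-out compounds rather than judged on training-set r² alone.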

Abstract:

Background. The secondary structure of folded RNA sequences is a good model for mapping phenotype onto genotype, as represented by the RNA sequence. Computational studies of the evolution of ensembles of RNA molecules towards target secondary structures yield valuable clues to the mechanisms behind the adaptation of complex populations. The relationship between the space of sequences and the space of structures, the organization of RNA ensembles at mutation-selection equilibrium, the time to adaptation as a function of the population parameters, the presence of collective effects in quasispecies, and the optimal mutation rates to promote adaptation are all issues that can be explored within this framework. Results. We investigate the effect of microscopic mutations on the phenotype of RNA molecules during their in silico evolution and adaptation. We calculate the distribution of the effects of mutations on fitness, the relative fractions of beneficial and deleterious mutations, and the corresponding selection coefficients for populations evolving under different mutation rates. Three different situations are explored: mutation-selection equilibrium (optimized population) in three different fitness landscapes, the dynamics during adaptation towards a goal structure (adapting population), and the behavior under periodic population bottlenecks (perturbed population). Conclusions. The ratio between the number of beneficial and deleterious mutations experienced by a population of RNA sequences increases with the value of the mutation rate µ at which evolution proceeds. In contrast, the selective value of mutations remains almost constant, independent of µ, indicating that adaptation occurs through an increase in the number of beneficial mutations, with little variation in the average effect they have on fitness. Statistical analyses of the distribution of fitness effects reveal that small effects, either beneficial or deleterious, are well described by a Pareto distribution.
These results are robust under changes in the fitness landscape, notably when, in addition to selection for a target secondary structure, specific subsequences or low-energy folds are required. A population perturbed by bottlenecks behaves similarly to an adapting population, struggling to return to the optimized state. Whether it can survive in the long run or goes extinct depends critically on the length of the time interval between bottlenecks. © 2010 Stich et al; licensee BioMed Central Ltd.
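The Pareto description of small fitness effects can be illustrated numerically. The snippet below fits scipy's Pareto parametrization to synthetic effect sizes that are Pareto-distributed by construction; the study's actual data and fitting procedure may well differ.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Synthetic stand-in for the magnitudes of small-effect mutations,
# drawn from a Pareto distribution with known shape parameter 2.5.
true_shape = 2.5
effects = stats.pareto.rvs(true_shape, size=5000, random_state=rng)

# Recover the shape parameter by maximum likelihood,
# fixing location and scale to their known simulation values.
shape_hat, loc_hat, scale_hat = stats.pareto.fit(effects, floc=0, fscale=1)
```

With a few thousand samples the fitted shape lands close to the true value; on empirical data one would additionally compare the Pareto fit against alternatives (e.g. exponential or lognormal) before accepting it.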

Abstract:

Post-disaster housing reconstruction projects face several challenges. Resources and material supplies are often scarce, and several different types of organizations are involved, while projects must be completed as quickly as possible to foster recovery. Within this context, the chapter aims to increase understanding of relief supply chain design in reconstruction. In addition, the chapter introduces a community-based, beneficiary perspective to relief supply chains by evaluating the implications of local components for supply chain design in reconstruction. This is achieved by means of a secondary data analysis based on the evaluation reports of two major housing reconstruction projects that took place in Europe in the last decade. A comparative analysis of the organizational designs of these projects highlights the ways in which users can be involved. The performance of reconstruction supply chains seems to depend to a large extent on the way beneficiaries are integrated in supply chain design, which impacts positively on the effectiveness of reconstruction supply chains.

Abstract:

PURPOSE: Previous investigations have demonstrated a relative vascular autoregulatory inefficiency of the inferior compared to the superior retina in healthy subjects breathing increased CO2. The purpose of this study was to determine whether the superior and inferior visual field sensitivities of healthy eyes are similarly affected during mild hypercapnia. DESIGN: Experimental study. METHODS: Visual field analysis (Humphrey Field Analyser; SITA standard 24-2 program) was carried out on one randomly selected eye of 22 subjects (mean age, 27.7 ± 5 years) during normal room air breathing and during isoxic hypercapnia. Student's paired t-tests were used to compare the visual field indices mean deviation (MD) and pattern standard deviation (PSD) between the breathing conditions. A secondary, sectoral analysis of mean pointwise sensitivity was performed for each condition. In each case a P value of <.01 was considered statistically significant (Bonferroni corrected). RESULTS: Visual field MD was -0.23 ± 0.95 dB during room air breathing and -0.49 ± 1.04 dB during hypercapnia (P = .034). Sectoral pointwise mean sensitivity deteriorated by 0.46 dB (P = .006) in the upper visual hemifield during hypercapnia, whereas no significant difference was observed for the lower hemifield (P = .331). CONCLUSIONS: The upper visual hemifield exhibited a significantly greater deterioration in pointwise visual field mean sensitivity than the lower hemifield during hypercapnic conditions. This suggests that the upper visual hemifield, and hence the inferior retina, is more susceptible to insult during hypercapnia than the superior retina in healthy individuals. A regional susceptibility of inferior retinal function to altered vascular or metabolic effects may account for the earlier and more frequent inferior nerve fibre damage associated with glaucomatous optic neuropathy. © 2003 by Elsevier Science Inc. All rights reserved.
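The statistical design (paired comparison of the two breathing conditions against a Bonferroni-adjusted threshold) can be sketched as follows. The subject values are invented for illustration, and the assumption of five comparisons behind the 0.05/5 correction is mine, chosen only because it reproduces the study's corrected P < .01 criterion.

```python
from scipy import stats

# Per-subject visual field mean deviation (dB), room air vs hypercapnia
# (synthetic values for 10 subjects; the actual study tested 22).
room_air    = [-0.2, 0.1, -0.5, 0.3, -0.1, -0.4, 0.0, -0.3, 0.2, -0.6]
hypercapnia = [-0.5, -0.2, -0.7, 0.0, -0.4, -0.6, -0.2, -0.6, -0.1, -0.9]

# Paired (repeated-measures) t-test on the within-subject differences.
t_stat, p = stats.ttest_rel(room_air, hypercapnia)

# Bonferroni correction: divide the family-wise alpha by the number of
# comparisons (an assumed five here, giving the per-test threshold 0.01).
corrected_alpha = 0.05 / 5
significant = p < corrected_alpha
```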

Abstract:

Full text: The idea of producing proteins from recombinant DNA hatched almost half a century ago. In his PhD thesis, Peter Lobban foresaw the prospect of inserting foreign DNA (from any source, including mammalian cells) into the genome of a λ phage in order to detect and recover protein products from Escherichia coli [1,2]. Only a few years later, in 1977, Herbert Boyer and his colleagues succeeded in the first ever expression of a peptide-coding gene in E. coli — they produced recombinant somatostatin [3], followed shortly after by human insulin. The field has advanced enormously since those early days and today recombinant proteins have become indispensable in advancing research and development in all fields of the life sciences. Structural biology, in particular, has benefitted tremendously from recombinant protein biotechnology, and an overwhelming proportion of the entries in the Protein Data Bank (PDB) are based on heterologously expressed proteins. Nonetheless, synthesizing, purifying and stabilizing recombinant proteins can still be thoroughly challenging. For example, the soluble proteome is organized to a large part into multicomponent complexes (in humans often comprising ten or more subunits), posing critical challenges for recombinant production. A third of all proteins in cells are located in the membrane, and pose special challenges that require a more bespoke approach. Recent advances may now mean that even these most recalcitrant of proteins could become tenable structural biology targets on a more routine basis. In this special issue, we examine progress in key areas that suggests this is indeed the case. Our first contribution examines the importance of understanding quality control in the host cell during recombinant protein production, and pays particular attention to the synthesis of recombinant membrane proteins.
A major challenge faced by any host cell factory is the balance it must strike between its own requirements for growth and the fact that its cellular machinery has essentially been hijacked by an expression construct. In this context, Bill and von der Haar examine emerging insights into the role of the dependent pathways of translation and protein folding in defining high-yielding recombinant membrane protein production experiments for the common prokaryotic and eukaryotic expression hosts. Rather than acting as isolated entities, many membrane proteins form complexes to carry out their functions. To understand their biological mechanisms, it is essential to study the molecular structure of the intact membrane protein assemblies. Recombinant production of membrane protein complexes is still a formidable, at times insurmountable, challenge. In these cases, extraction from natural sources is the only option to prepare samples for structural and functional studies. Zorman and co-workers, in our second contribution, provide an overview of recent advances in the production of multi-subunit membrane protein complexes and highlight recent achievements in membrane protein structural research brought about by state-of-the-art near-atomic resolution cryo-electron microscopy techniques. E. coli has been the dominant host cell for recombinant protein production. Nonetheless, eukaryotic expression systems, including yeasts, insect cells and mammalian cells, are increasingly gaining prominence in the field. The yeast species Pichia pastoris is a well-established recombinant expression system for a number of applications, including the production of a range of different membrane proteins. Byrne reviews high-resolution structures that have been determined using this methylotroph as an expression host. Although it is not yet clear why P.
pastoris is suited to producing such a wide range of membrane proteins, its ease of use and the availability of diverse tools that can be readily implemented in standard bioscience laboratories mean that it is likely to become an increasingly popular option in structural biology pipelines. The contribution by Columbus concludes the membrane protein section of this volume. In her overview of post-expression strategies, Columbus surveys the four most common biochemical approaches for the structural investigation of membrane proteins. Limited proteolysis has successfully aided structure determination of membrane proteins in many cases. Deglycosylation of membrane proteins following production and purification analysis has also facilitated membrane protein structure analysis. Moreover, chemical modifications, such as lysine methylation and cysteine alkylation, have proven their worth to facilitate crystallization of membrane proteins, as well as NMR investigations of membrane protein conformational sampling. Together these approaches have greatly facilitated the structure determination of more than 40 membrane proteins to date. It may be an advantage to produce a target protein in mammalian cells, especially if authentic post-translational modifications such as glycosylation are required for proper activity. Chinese Hamster Ovary (CHO) cells and Human Embryonic Kidney (HEK) 293 cell lines have emerged as excellent hosts for heterologous production. The generation of stable cell-lines is often an aspiration for synthesizing proteins expressed in mammalian cells, in particular if high volumetric yields are to be achieved. In his report, Buessow surveys recent structures of proteins produced using stable mammalian cells and summarizes both well-established and novel approaches to facilitate stable cell-line generation for structural biology applications. The ambition of many biologists is to observe a protein's structure in the native environment of the cell itself. 
Until recently, this seemed to be more of a dream than a reality. Advances in nuclear magnetic resonance (NMR) spectroscopy techniques, however, have now made possible the observation of mechanistic events at the molecular level of protein structure. Smith and colleagues, in an exciting contribution, review emerging ‘in-cell NMR’ techniques that demonstrate the potential to monitor biological activities by NMR in real time in native physiological environments. A current drawback of NMR as a structure determination tool derives from size limitations of the molecule under investigation and the structures of large proteins and their complexes are therefore typically intractable by NMR. A solution to this challenge is the use of selective isotope labeling of the target protein, which results in a marked reduction of the complexity of NMR spectra and allows dynamic processes even in very large proteins and even ribosomes to be investigated. Kerfah and co-workers introduce methyl-specific isotopic labeling as a molecular tool-box, and review its applications to the solution NMR analysis of large proteins. Tyagi and Lemke next examine single-molecule FRET and crosslinking following the co-translational incorporation of non-canonical amino acids (ncAAs); the goal here is to move beyond static snap-shots of proteins and their complexes and to observe them as dynamic entities. The encoding of ncAAs through codon-suppression technology allows biomolecules to be investigated with diverse structural biology methods. In their article, Tyagi and Lemke discuss these approaches and speculate on the design of improved host organisms for ‘integrative structural biology research’. Our volume concludes with two contributions that resolve particular bottlenecks in the protein structure determination pipeline. The contribution by Crepin and co-workers introduces the concept of polyproteins in contemporary structural biology. Polyproteins are widespread in nature. 
They represent long polypeptide chains in which individual smaller proteins with different biological function are covalently linked together. Highly specific proteases then tailor the polyprotein into its constituent proteins. Many viruses use polyproteins as a means of organizing their proteome. The concept of polyproteins has now been exploited successfully to produce hitherto inaccessible recombinant protein complexes. For instance, by means of a self-processing synthetic polyprotein, the influenza polymerase, a high-value drug target that had remained elusive for decades, has been produced, and its high-resolution structure determined. In the contribution by Desmyter and co-workers, a further, often imposing, bottleneck in high-resolution protein structure determination is addressed: The requirement to form stable three-dimensional crystal lattices that diffract incident X-ray radiation to high resolution. Nanobodies have proven to be uniquely useful as crystallization chaperones, to coax challenging targets into suitable crystal lattices. Desmyter and co-workers review the generation of nanobodies by immunization, and highlight the application of this powerful technology to the crystallography of important protein specimens including G protein-coupled receptors (GPCRs). Recombinant protein production has come a long way since Peter Lobban's hypothesis in the late 1960s, with recombinant proteins now a dominant force in structural biology. The contributions in this volume showcase an impressive array of inventive approaches that are being developed and implemented, ever increasing the scope of recombinant technology to facilitate the determination of elusive protein structures. Powerful new methods from synthetic biology are further accelerating progress. Structure determination is now reaching into the living cell with the ultimate goal of observing functional molecular architectures in action in their native physiological environment. 
We anticipate that even the most challenging protein assemblies will be tackled by recombinant technology in the near future.

Abstract:

The paper investigates the role of current and future IT applications in 3PL services in Europe and the Far East. To support clients' competitive advantage, 3PL providers increasingly contribute IT systems to logistics, helping them to enhance supply chain collaboration with business partners. Through qualitative interviews, questionnaires and secondary data analysis, common attributes of both regions' IT systems are identified that enable supply chain partners to collaborate and share information. Most companies already implement IT systems for processing transactions, but both motivations for and barriers to further adoption remain, since 3PL providers have an incomplete understanding of their clients' IT requirements. Long-term productivity gains will require sophisticated IT systems that streamline cycles and improve supply chain visibility, thus facilitating planning and decision-making.

Abstract:

This study identifies and describes HIV Voluntary Counseling and Testing (VCT) among middle-aged and older Latinas. The rate of new HIV cases in people aged 45 and older is rapidly increasing, with a 40.6% increase in the number of older Latinas infected with HIV between 1998 and 2002. Despite this increase, there is a paucity of research on this population. This research seeks to address that gap through a secondary data analysis of Latina women. The aim of this study is twofold: (1) to develop and empirically test a multivariate model of VCT utilization for middle-aged and older Latinas; and (2) to test how the three individual components of the Andersen Behavioral Model impact VCT for middle-aged and older Latinas. The study is organized around the three major domains of the Andersen Behavioral Model of service use: (a) predisposing factors; (b) enabling characteristics; and (c) need. Logistic regression using structural equation modeling techniques was used to test multivariate relationships among variables affecting VCT for a sample of 135 middle-aged and older Latinas residing in Miami-Dade County, Florida. Over 60% of participants had been tested for HIV. Provider endorsement was found to be the strongest predictor of VCT (odds ratio [OR] = 6.38), followed by having a clinic as a regular source of healthcare (OR = 3.88). Significant negative associations with VCT included self-rated health status (OR = 0.592), age (OR = 0.927), Spanish proficiency (OR = 0.927), number of sexual partners (OR = 0.613) and consumption of alcohol during sexual activity (OR = 0.549). As this line of inquiry provides a critical glimpse into the VCT of older Latinas, recommendations for enhanced service provision and research are offered.
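An odds ratio like the one reported for provider endorsement can be computed directly from a 2×2 cross-tabulation. The counts below are hypothetical, chosen only to show the arithmetic, and the Woolf confidence interval is a standard accompaniment rather than the study's exact method (the study used model-based ORs).

```python
import math

# Hypothetical 2x2 table: provider endorsement vs. having had an HIV test.
#                 [tested, not tested]
endorsed     = [60, 10]   # endorsement present
not_endorsed = [30, 35]   # endorsement absent

a, b = endorsed
c, d = not_endorsed

# Cross-product (odds) ratio and Woolf 95% CI computed on the log scale.
odds_ratio = (a * d) / (b * c)
se_log_or = math.sqrt(1/a + 1/b + 1/c + 1/d)
ci_low  = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
ci_high = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)
```

An OR well above 1 with a confidence interval excluding 1, as with these example counts, is the pattern behind statements such as "provider endorsement was the strongest predictor".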

Relevância:

80.00%

Publicador:

Resumo:

There is a commonly presumed link among sexual risk behavior, substance use, and other psychosocial factors in adolescents. However, these relationships have been relatively understudied in detained, low-income, minority, substance-abusing adolescents. This study addresses this gap in the literature with a secondary data analysis of a sample of adolescent offenders in two detention and treatment centers in Miami-Dade County. Univariate and bivariate statistical analyses and multivariate logistic regressions were conducted on baseline data from structured interviews with 455 adolescents participating in an NIH-funded prevention intervention. Data were analyzed to assess relationships among self-reported substance use, STD history, HIV/AIDS knowledge, condom use, condom use attitudes and skills, peer and parental approval of condom use, and race/ethnicity. The adolescent sample was 74.1% male and 25.9% female; 35.4% were African American, 25.1% non-African American Latino, 11.2% White, and 28.4% of other race/ethnicity categories. The mean age was 15.6 years. Results suggested that alcohol use (p < 0.001) and use of marijuana, cocaine, and other drugs (p < 0.001) are significant variables in explaining the variability in sexual risk behaviors. Results also suggested that unprotected vaginal, anal, and oral sex increased with higher alcohol and drug use (p < 0.001) and that positive attitudes about personally using condoms (p < 0.001) were significantly related to condom use. Logistic regressions showed that race/ethnicity was a significant control variable in explaining the variability of condom use. Being White or Latino was significantly associated with less condom use during oral and anal sex compared with other racial/ethnic groups. These results indicate that risky sexual behavior and HIV infection risk are significantly associated with substance use, particularly alcohol use.
Therefore, proper screening and identification of alcohol use, and condom use attitudes could maximize the efficacy of referrals to programs targeting both issues and increase the potential for appropriate primary and secondary prevention and treatment among adolescent detainees.
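Bivariate analyses of the kind described above typically test an association such as alcohol use versus unprotected sex with a Pearson chi-square test on a contingency table. A minimal sketch with hypothetical counts (an assumption for illustration, not the study's data):

```python
import numpy as np

def chi_square_2x2(table):
    """Pearson chi-square statistic for a 2x2 contingency table
    (rows: alcohol use yes/no, cols: unprotected sex yes/no)."""
    table = np.asarray(table, dtype=float)
    row = table.sum(axis=1, keepdims=True)
    col = table.sum(axis=0, keepdims=True)
    expected = row @ col / table.sum()           # expected counts under independence
    return ((table - expected) ** 2 / expected).sum()

# Hypothetical counts for 455 detainees, NOT the study's data:
observed = [[150, 80],    # alcohol users:  unprotected / protected
            [90, 135]]    # non-users:      unprotected / protected
stat = chi_square_2x2(observed)
# Critical value for df=1 at alpha=0.05 is 3.841
print(stat > 3.841)       # → True
```

With these illustrative counts the statistic is far above the 3.841 cutoff, the kind of result that would underlie a reported p < 0.001.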

Relevância:

80.00%

Publicador:

Resumo:

This study investigated the nature and impact of the sexual abuse of children ages birth through 6 years. The purpose was to enhance knowledge about this understudied population through examination of: (1) characteristics of the abuse; (2) socioemotional developmental outcomes of young victims; and (3) potential moderating effects of family dynamics. An ecological-developmental theoretical framework was applied. Secondary data analysis was conducted using data collected from the consortium Longitudinal Studies of Child Abuse and Neglect (LONGSCAN). A sample of 250 children was drawn from LONGSCAN data, including children who were sexually abused (n=125) and their nonabused counterparts (n=125), matched on demographic variables. Results revealed that young victims of sexual abuse were disproportionately female (91 girls; 73%). The sexual abuse committed against these youngsters was severe in nature, with 111 children (89%) experiencing contact offenses ranging from fondling to forcible rape. Sixty-two percent of child victims demonstrated borderline, clinical, or less than adequate functioning on normative, expected socioemotional outcomes. Child victims reported low degrees of perceived competence and satisfaction in the social environment. When compared with their nonabused counterparts, child victims demonstrated significantly poorer socioemotional functioning, as evidenced by aggressive behaviors, attention and thought problems. Sexually abused youngsters also reported lower self-perceptions of cognitive and physical competence and maternal acceptance. Family dynamic factors did not significantly moderate the relationships between abuse and socioemotional outcomes, with one exception. The caregivers’ degree of empathy for their children had a significant moderating effect on the children’s social problems. This study contributes to an otherwise scant body of literature on the sexual abuse of preschoolers. 
Findings provide implications for social work practice, especially in the development of assessment and prevention strategies.

Relevância:

80.00%

Publicador:

Resumo:

I conducted this study to provide insights toward a deeper understanding of the association between culture and writing by building, assessing, and refining a conceptual model of second language writing. To do this, I examined culture and coherence, as well as the relationship between them, through a mixed methods research design. Coherence has been an important and complex concept in ESL/EFL writing. I studied the concept of coherence in the research context of contrastive rhetoric, comparing the coherence quality of argumentative essays written by undergraduates in Mainland China and by their U.S. peers. In order to analyze the complex concept of coherence, I synthesized five linguistic theories of coherence: Halliday and Hasan's cohesion theory, Carroll's theory of coherence, Enkvist's theory of coherence, Topical Structure Analysis, and Toulmin's Model. Based upon this synthesis, 16 variables were generated. Across these 16 variables, a Hotelling t-test was conducted to assess differences in argumentative coherence between essays written by the two groups of participants. To complement the statistical analysis, I conducted 30 interviews with the writers in the study. Participants' responses were analyzed with open and axial coding. By analyzing the empirical data, I refined the conceptual model, adding more categories and establishing associations among them. The study found that U.S. students made use of more pronominal reference, while Chinese students adopted more lexical devices of reiteration and extended paralleling progression. The interview data implied that this difference may be associated with differences in linguistic features and rhetorical conventions between Chinese and English. As far as Toulmin's Model is concerned, Chinese students scored higher on data than their U.S. peers.
According to the interview data, this may be because Toulmin's Model, modified as three elements of argument, has long been widely taught in Chinese writing instruction, while U.S. interview participants said that they were not taught to write essays according to Toulmin's Model. Implications were generated from the process of textual data analysis and the formulation of a structural model defining coherence. These implications are aimed at informing writing instruction, assessment, peer review, and self-revision.
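The Hotelling t-test applied across the 16 coherence variables generalizes the two-sample t-test to a comparison of mean vectors. A minimal sketch of the two-sample Hotelling T² statistic on synthetic data, using three illustrative variables and assumed group means and sample sizes rather than the study's:

```python
import numpy as np

def hotelling_t2(X1, X2):
    """Two-sample Hotelling T^2 statistic comparing the mean vectors of
    two groups across several variables (the study used 16)."""
    n1, n2 = len(X1), len(X2)
    d = X1.mean(axis=0) - X2.mean(axis=0)     # difference of mean vectors
    S1 = np.cov(X1, rowvar=False)
    S2 = np.cov(X2, rowvar=False)
    Sp = ((n1 - 1) * S1 + (n2 - 1) * S2) / (n1 + n2 - 2)  # pooled covariance
    return (n1 * n2) / (n1 + n2) * d @ np.linalg.solve(Sp, d)

rng = np.random.default_rng(1)
p = 3   # three illustrative coherence variables (assumed, not the study's)
us = rng.normal([5.0, 2.0, 1.0], 1.0, size=(40, p))     # e.g. more pronominal reference
china = rng.normal([3.5, 3.0, 1.0], 1.0, size=(40, p))  # e.g. more reiteration
t2 = hotelling_t2(us, china)
print(t2 > 0)
```

A large T² (relative to its F-distribution cutoff) indicates that the two groups' mean vectors differ on at least some of the variables jointly, which is what motivates the follow-up per-variable comparisons reported above.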

Relevância:

80.00%

Publicador:

Resumo:

The move from Standard Definition (SD) to High Definition (HD) represents a sixfold increase in the data that needs to be processed. With expanding resolutions and evolving compression, there is a need for high-performance, flexible architectures that allow quick upgradability. Technology continues to advance in image display resolution, compression techniques, and video intelligence. Software implementations of these systems can attain accuracy, with tradeoffs among processing performance (achieving specified frame rates on large image data sets), power, and cost constraints. New architectures are needed to keep pace with the fast innovations in video and imaging. This dissertation therefore implements the pixel-rate and frame-rate processes in dedicated hardware on a Field Programmable Gate Array (FPGA) to achieve real-time performance.
The contributions of the dissertation are as follows. (1) We develop a target detection system by applying a novel running average mean threshold (RAMT) approach to globalize the threshold required for background subtraction. This approach adapts the threshold automatically to different environments (indoor and outdoor) and different targets (humans and vehicles). For low power consumption and better performance, we design the complete system on an FPGA. (2) We introduce a safe distance factor and develop an algorithm for detecting occlusion occurrence during target tracking. A novel mean threshold is calculated by motion-position analysis. (3) A new strategy for gesture recognition is developed using Combinational Neural Networks (CNN) based on a tree structure. The method is evaluated on American Sign Language (ASL) gestures. We introduce a novel points-of-interest approach to reduce the feature vector size and a gradient threshold approach for accurate classification. (4) We design a gesture recognition system using hardware/software co-simulation of a neural network for the high speed and low memory storage requirements provided by the FPGA. We develop an innovative maximum distance algorithm which uses only 0.39% of the image as the feature vector to train and test the system design. The gestures involved in different applications may vary; it is therefore essential to keep the feature vector as small as possible while maintaining the same accuracy and performance.
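The running average mean threshold (RAMT) idea in contribution (1) can be sketched in software as a running-average background model with a frame-global threshold derived from the mean frame difference. The update rule, parameter values, and synthetic frames below are illustrative assumptions, not the dissertation's FPGA design:

```python
import numpy as np

def ramt_detect(frames, alpha=0.05, k=2.5):
    """Background subtraction with a running-average background model and a
    global per-frame threshold set from the mean absolute difference.
    alpha and k are assumed parameters for this sketch."""
    bg = frames[0].astype(float)
    masks = []
    for f in frames[1:]:
        diff = np.abs(f.astype(float) - bg)
        thresh = k * diff.mean()            # global threshold adapts per frame
        masks.append(diff > thresh)         # foreground where diff exceeds it
        bg = (1 - alpha) * bg + alpha * f   # update running-average background
    return masks

# Synthetic 8x8 frames: static background, a bright 2x2 "target" appears.
frames = [np.full((8, 8), 50, dtype=np.uint8) for _ in range(5)]
for f in frames[2:]:
    f[3:5, 3:5] = 200                       # appearing target
masks = ramt_detect(frames)
print(masks[-1][3, 3], masks[-1][0, 0])     # target flagged, background not
```

Because the threshold is a scalar per frame rather than per pixel, the comparison and the exponential background update map naturally onto fixed-point pixel-rate hardware, which is the appeal of a globalized threshold on an FPGA.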

Relevância:

80.00%

Publicador:

Resumo:

Purpose: Most individuals who meet diagnostic criteria for substance use disorders do not perceive a need for substance use treatment, and they are the least likely to pursue treatment voluntarily. There are also those who perceive a need for treatment and yet do not pursue it. This study aimed to understand which factors increase the likelihood of perceiving a need for treatment among individuals who meet diagnostic criteria for substance use disorders, in the hope of better targeting gender-specific treatment recruitment and retention efforts. Using Andersen and Newman's (1973/2005) model of individual determinants of healthcare utilization, the central hypothesis of the study was that gender moderates the relationship between substance use problem severity and perceived treatment need, such that women with increasing problems due to their substance use are more likely than men to perceive a need for treatment. Additional predisposing and enabling factors from Andersen and Newman's (1973/2005) model were included in the study to understand their impact on perceived need. Method: The study was a secondary data analysis of the 2010 National Survey on Drug Use and Health (NSDUH) using logistic regression. The weighted sample consisted of 20,077,235 American household residents (the unweighted sample was 5,484 participants). Because perceived treatment need is a rare event, results of the logistic regression were verified using Relogit software for rare events logistic regression (King & Zeng, 2001a; 2001b). Results: The hypothesized moderating effect of female gender was not found. Conversely, men were significantly more likely than women to perceive a need for treatment as substance use problem severity increased.
The study also found that a number of factors, such as race, ethnicity, socioeconomic status, age, marital status, education, co-occurring mental health disorders, and prior treatment history, differentially affected the likelihood of perceiving a need for treatment among men and women. Conclusion: Perceived treatment need among individuals who meet criteria for substance use disorders is rare, but identifying factors associated with an increased likelihood of perceiving a need for treatment can aid the development of gender-appropriate outreach and recruitment for social work treatment and of public health messages.
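The rare-events method of King and Zeng cited above includes, among other adjustments, a prior correction of the logit intercept when events (here, perceived treatment need) are sampled at a different rate than they occur in the population. A minimal sketch of that intercept correction, with illustrative numbers rather than the study's estimates:

```python
import math

def prior_corrected_intercept(beta0, tau, ybar):
    """King & Zeng (2001) prior correction of the logistic intercept when
    the sample event proportion (ybar) differs from the population event
    proportion (tau). Slope coefficients are left unchanged; all numeric
    values in this sketch are illustrative assumptions."""
    return beta0 - math.log(((1 - tau) / tau) * (ybar / (1 - ybar)))

# E.g. perceived treatment need is rare in the population (tau = 5%) but
# makes up 30% of an analytic sample:
b0 = prior_corrected_intercept(beta0=-0.5, tau=0.05, ybar=0.30)
print(round(b0, 3))  # → -2.597
```

The correction pulls the intercept down so that predicted probabilities reflect the population rarity of the event, which is why such software was used to verify the standard logistic regression results.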