922 results for work system


Relevance: 30.00%

Abstract:

BACKGROUND: Bioluminescence imaging is widely used for cell-based assays and animal imaging studies, both in biomedical research and drug development. Its main advantages include its high-throughput applicability, affordability, high sensitivity, operational simplicity, and quantitative outputs. In malaria research, bioluminescence has been used for drug discovery in vivo and in vitro, exploring host-pathogen interactions, and studying multiple aspects of Plasmodium biology. While the number of fluorescent proteins available for imaging has undergone a great expansion over the last two decades, enabling simultaneous visualization of multiple molecular and cellular events, the expansion of available luciferases has lagged behind. The most widely used bioluminescent probe in malaria research is the Photinus pyralis firefly luciferase, followed by the more recently introduced click beetle and Renilla luciferases. Ultra-sensitive imaging of Plasmodium at low parasite densities has not previously been achieved. To overcome these challenges, a Plasmodium berghei line expressing the novel ultra-bright luciferase enzyme NanoLuc, called PbNLuc, has been generated and is presented in this work. RESULTS: NanoLuc shows at least 150 times brighter signal than firefly luciferase in vitro, allowing single parasite detection in mosquito, liver, and sexual and asexual blood stages. As a proof-of-concept, the PbNLuc parasites were used to image parasite development in the mosquito, liver, and blood stages of infection, and to specifically explore parasite liver stage egress and pre-patency period in vivo. CONCLUSIONS: PbNLuc is a suitable parasite line for sensitive imaging of the entire Plasmodium life cycle.
Its sensitivity makes it a promising line to be used as a reference for drug candidate testing, as well as for the characterization of mutant parasites to explore the function of parasite proteins and host-parasite interactions, and to better understand Plasmodium biology. Since the substrate requirements of NanoLuc differ from those of firefly luciferase, dual bioluminescence imaging for the simultaneous characterization of two lines, or of two separate biological processes, is possible, as demonstrated in this work.

Relevance: 30.00%

Abstract:

Asteroid 2008 TC3 (approximately 4 m diameter) was tracked and studied in space for approximately 19 h before it impacted Earth's atmosphere, shattering at 44-36 km altitude. The recovered samples (>680 individual rocks) comprise the meteorite Almahata Sitta (AhS). Approximately 50-70% of these are ureilites (ultramafic achondrites). The rest are chondrites, mainly enstatite, ordinary, and Rumuruti types. The goal of this work is to understand how fragments of so many different types of parent bodies became mixed in the same asteroid. Almahata Sitta has been classified as a polymict ureilite with an anomalously high component of foreign clasts. However, we calculate that the mass of fallen material was 0.1% of the pre-atmospheric mass of the asteroid. Based on published data for the reflectance spectrum of the asteroid and laboratory spectra of the samples, we infer that the lost material was mostly ureilitic. Therefore, 2008 TC3 probably contained only a few percent nonureilitic materials, similar to other polymict ureilites except less well consolidated. From available data for the AhS meteorite fragments, we conclude that 2008 TC3 samples essentially the same range of types of ureilitic and nonureilitic materials as other polymict ureilites. We therefore suggest that the immediate parent of 2008 TC3 was the immediate parent of all ureilitic material sampled on Earth. We trace critical stages in the evolution of that material through solar system history. Based on various types of new modeling and re-evaluation of published data, we propose the following scenario. (1) The ureilite parent body (UPB) accreted 0.5-0.6 Ma after formation of calcium-aluminum-rich inclusions (CAI), beyond the ice line (outer asteroid belt). Differentiation began approximately 1 Ma after CAI. (2) The UPB was catastrophically disrupted by a major impact approximately 5 Ma after CAI, with selective subsets of the fragments reassembling into daughter bodies.
(3) Either the UPB (before breakup), or one of its daughters (after breakup), migrated to the inner belt due to scattering by massive embryos. (4) One daughter (after forming in or migrating to the inner belt) became the parent of 2008 TC3. It developed a regolith, mostly 3.8 Ga ago. Clasts of enstatite, ordinary, and Rumuruti-type chondrites were implanted by low-velocity collisions. (5) Recently, the daughter was disrupted. Fragments were injected or drifted into Earth-crossing orbits. 2008 TC3 comes from outer layers of regolith, other polymict ureilites from deeper regolith, and main group ureilites from the interior of this body. In contrast to other models that have been proposed, this model invokes a stochastic history to explain the unique diversity of foreign materials in 2008 TC3 and other polymict ureilites.
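The 0.1% fallen-mass figure can be sanity-checked with back-of-envelope arithmetic from the ~4 m diameter quoted above. The bulk density and total fallen mass below are illustrative assumptions of mine, not the paper's inputs:

```python
import math

# Rough check of the fallen-mass fraction for 2008 TC3.
# Assumed values (not from the abstract): bulk density and fallen mass.
diameter_m = 4.0          # ~4 m pre-atmospheric diameter (from the abstract)
density_kg_m3 = 2800.0    # assumed ureilite-like bulk density
fallen_mass_kg = 100.0    # assumed total mass reaching the ground (illustrative)

radius = diameter_m / 2
pre_atmospheric_mass = (4 / 3) * math.pi * radius**3 * density_kg_m3
fraction = fallen_mass_kg / pre_atmospheric_mass
print(f"pre-atmospheric mass ~ {pre_atmospheric_mass / 1000:.0f} t")
print(f"fallen fraction ~ {fraction:.2%}")
```

With these assumed inputs the asteroid masses on the order of 100 t, so a ~100 kg fall is indeed about 0.1% of the pre-atmospheric mass.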

Relevance: 30.00%

Abstract:

This work is aimed at improving our current knowledge of the non-enzymatic mechanisms involved in brown-rot decay, as well as exploring potential applications of a brown-rot mimetic model system in paper recycling processes. The study was divided into two parts. The first part focused on the chemical mechanisms involved in chelation and reduction of iron by a low molecular weight chelator (isolated from the brown-rot fungus Gloeophyllum trabeum) and its model compound 2,3-dihydroxybenzoic acid (2,3-DHBA). Chelation as well as free radical generation mediated by this system were studied by ESR measurement. The results indicate that the effects of the chelator/iron ratio, the pH, and other reaction parameters on hydroxyl radical generation by a Fenton-type system could be determined using ESR spin-trapping techniques. The results also support the hypothesis that superoxide radicals are involved in the chelator-mediated Fenton process. In the second part of the study, the effect of a chelator-mediated Fenton system on the improvement of deinking efficiency and the modification of fiber and paper properties was studied. For the deinking study, copy paper was laser printed with an identical standard pattern. Then repulping and flotation operations were performed to remove ink particles. Under properly controlled deinking conditions, the chelator-mediated treatment (CMT) resulted in a reduction in dirt count over that of conventional deinking procedures with no significant loss of pulp strength. To study the effect of the chelator system treatment on the quality of pulp with different fines content, a fully bleached hardwood kraft pulp was beaten to different freeness levels and treated with the chelator-mediated free radical system. The results show that virgin fiber and heavily beaten fiber respond differently to the free radical treatment.
Unbeaten fibers become more flexible and easier to collapse after free radical treatment, while beaten fibers show a reduction in fines and small materials after mild free radical treatment.
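For reference, the Fenton chemistry invoked above is the standard reaction in which ferrous iron reduces hydrogen peroxide to a hydroxyl radical, with superoxide proposed (as in the hypothesis tested here) to re-reduce the ferric iron and sustain the cycle:

```latex
\mathrm{Fe^{2+} + H_2O_2 \longrightarrow Fe^{3+} + OH^- + {}^{\bullet}OH}
\qquad
\mathrm{Fe^{3+} + O_2^{\bullet-} \longrightarrow Fe^{2+} + O_2}
```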

Relevance: 30.00%

Abstract:

The study of operations on representations of objects is well documented in the realm of spatial engineering. However, the mathematical structure and formal proof of these operational phenomena are not thoroughly explored. Other works have often focused on query-based models that seek to order classes and instances of objects in the form of semantic hierarchies or graphs. In some models, nodes of graphs represent objects and are connected by edges that represent different types of coarsening operators. This work, however, studies how the coarsening operator "simplification" can manipulate partitions of finite sets, independent of objects and their attributes. Partitions that are "simplified" first have a collection of elements filtered (removed), and then the remaining partition is amalgamated (some sub-collections are unified). Simplification has many interesting mathematical properties. A finite composition of simplifications can also be accomplished with a single simplification. Also, if one partition is a simplification of the other, the simplified partition is defined to be less than the other partition according to the simp relation. This relation is shown to be a partial-order relation based on simplification. Collections of partitions can not only be proven to have a partial-order structure, but also have a lattice structure and are complete. In regard to a geographic information system (GIS), partitions related to subsets of attribute domains for objects are called views. Objects belong to different views based on whether or not their attribute values lie in the underlying view domain. Given a particular view, objects with their attribute n-tuple codings contained in the view are part of the actualization set on views, and objects are labeled according to the particular subset of the view in which their coding lies.
Though the scope of the work does not mainly focus on queries related directly to geographic objects, it provides verification for the existence of particular views in a system with this underlying structure. Given a finite attribute domain, one can say with mathematical certainty that different views of objects are partially ordered by simplification, and every collection of views has a greatest lower bound and least upper bound, which provides the validity for exploring queries in this regard.
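The filter-then-amalgamate operation described above can be made concrete. The representation (a partition as a list of disjoint blocks) and the function below are my own illustration, not the thesis's formalism:

```python
from itertools import chain

# A partition of a finite set is modeled as a list of disjoint blocks (sets).
# "Simplification" first filters out some elements, then amalgamates groups
# of the surviving blocks into single blocks.

def simplify(partition, removed, groups):
    """Remove `removed` elements, then union each listed group of block indices."""
    filtered = [blk - removed for blk in partition]
    filtered = [blk for blk in filtered if blk]  # drop blocks emptied by filtering
    return [set(chain.from_iterable(filtered[i] for i in g)) for g in groups]

p = [{1, 2}, {3}, {4, 5}]
# Remove element 5, then amalgamate the first two remaining blocks.
q = simplify(p, removed={5}, groups=[[0, 1], [2]])
print(q)  # [{1, 2, 3}, {4}]
```

The result is itself a partition of the filtered set, which is the closure property that makes the composition and partial-order claims above well posed.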

Relevance: 30.00%

Abstract:

The Carrabassett Valley Sanitary District in Carrabassett Valley, Maine, has utilized both a forest spray irrigation system and a Snowfluent™ system for the treatment of their wastewater effluent. This study was designed to evaluate potential changes in soil properties after approximately 20 years of treatment in the forested spray irrigation site and three years of treatment in the field Snowfluent™ site. In addition, grass yield and composition were evaluated on the field study sites. After treatment with effluent or Snowfluent™, soils showed an increase in soil exchangeable Ca, Mg, Na, and K, base saturation, and pH. While most constituents were higher in treated soils, available P was lower in treated soils compared to the controls. This difference was attributed to higher rates of P mineralization from soil organic matter due to an irrigation effect of the treatment, depleting available P pools despite the P addition with the treatment. Most of the differences due to treatment were greatest at the surface and diminished with depth. Depth patterns in soil properties mostly reflected the decreasing influence of organic matter and its decomposition products with depth, as evidenced by significantly higher total C in the surface compared to lower horizons. There were decreasing concentrations of total N, and exchangeable or extractable Ca, Mg, Na, K, Mn, Zn, and P with depth. In addition, there was decreasing base saturation with depth, driven primarily by declining exchangeable Ca and Mg. Irrigation with Snowfluent™ altered the chemical composition of the grass on the site. All element concentrations were significantly higher in the grass foliage except for Ca. The differences were attributed to the additional nutrients and moisture derived from the Snowfluent™. The use of forest spray irrigation and Snowfluent™ as a wastewater treatment strategy appears to work well.
The soil and vegetation were able to retain most of the applied nutrients, and do not appear to be moving toward saturation. Vegetation management may be a key tool for managing nutrient accumulation on the grass sites as the system ages.
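Base saturation, reported repeatedly above, is conventionally the share of the cation exchange capacity occupied by the base cations Ca, Mg, Na, and K. A minimal sketch with made-up values (not the study's data):

```python
# Illustrative base-saturation calculation; all values are invented examples.
exchangeable = {"Ca": 4.0, "Mg": 1.2, "Na": 0.3, "K": 0.5}  # cmolc/kg
cec = 9.0                                                    # cation exchange capacity, cmolc/kg
bs = 100 * sum(exchangeable.values()) / cec
print(f"base saturation = {bs:.0f}%")
```

Rising exchangeable Ca, Mg, Na, and K from effluent addition raises the numerator, which is why treated soils show both higher base saturation and higher pH.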

Relevance: 30.00%

Abstract:

In a marvelous but somewhat neglected paper, 'The Corporation: Will It Be Managed by Machines?', Herbert Simon articulated from the perspective of 1960 his vision of what we now call the New Economy: the machine-aided system of production and management of the late twentieth century. Simon's analysis sprang from what I term the principle of cognitive comparative advantage: one has to understand the quite different cognitive structures of humans and machines (including computers) in order to explain and predict the tasks to which each will be most suited. Perhaps unlike Simon's better-known predictions about progress in artificial intelligence research, the predictions of this 1960 article hold up remarkably well and continue to offer important insights. In what follows I attempt to tell a coherent story about the evolution of machines and the division of labor between humans and machines. Although inspired by Simon's 1960 paper, I weave many other strands into the tapestry, from classical discussions of the division of labor to present-day evolutionary psychology. The basic conclusion is that, with growth in the extent of the market, we should see humans 'crowded into' tasks that call for the kinds of cognition for which humans have been equipped by biological evolution. These human cognitive abilities range from the exercise of judgment in situations of ambiguity and surprise to more mundane abilities in spatio-temporal perception and locomotion. Conversely, we should see machines 'crowded into' tasks with a well-defined structure. This conclusion is not based (merely) on a claim that machines, including computers, are specialized idiots-savants today because of the limits (whether temporary or permanent) of artificial intelligence; rather, it rests on a claim that, for what are broadly 'economic' reasons, it will continue to make economic sense to create machines that are idiots-savants.

Relevance: 30.00%

Abstract:

Schulwerkstätte für Schlosserarbeiten [School workshop for locksmith work] (illustration) [electronic resource]: photograph

Relevance: 30.00%

Abstract:

"The Relation between Psychology and Sociology in the Work of Wilhelm Dilthey" (GS 4, pp. 352-370), published in Studies in Philosophy and Social Science VIII, 1939/40, pp. 430-443; lecture text, English version, typescript with autograph corrections, 18 sheets; German version, typescript with autograph corrections, 20 sheets; notes for the opening of the lecture event, typescript, 1 sheet; on Dilthey's relation to Max Weber; on the contradictions in Dilthey; on the logic of understanding in the humanities (preparatory work for the lecture? preparations for discussion contributions?), a) English version, typescript, 5 sheets, b) German version, typescript, 6 sheets; N.N.: handwritten note for the discussion, 1 sheet; excerpts on the work of Wilhelm Dilthey, typescripts, 12 sheets; quotations from the writings of Wilhelm Dilthey, typescripts, 12 sheets; German translation of the essay by Kurt Jürgen Huch and Alfred Schmidt, entitled "Der Zusammenhang zwischen Psychologie und Soziologie im Werk Wilhelm Diltheys", published in: Max Horkheimer, "Kritische Theorie", vol. II, 1968, pp. 273-291, typescript with handwritten corrections, 29 sheets; "Autoritärer Staat" (GS 5, pp. 293-319), essay, dated spring 1940, published as a mimeographed typescript in "Walter Benjamin zum Gedächtnis", edited by the Institut für Sozialforschung, Los Angeles 1942, pp. 123-161.

Relevance: 30.00%

Abstract:

The usage of intensity modulated radiotherapy (IMRT) treatments necessitates a significant amount of patient-specific quality assurance (QA). This research has investigated the precision and accuracy of Kodak EDR2 film measurements for IMRT verifications, the use of comparisons between 2D dose calculations and measurements to improve treatment plan beam models, and the dosimetric impact of delivery errors. New measurement techniques and software were developed and used clinically at M. D. Anderson Cancer Center. The software implemented two new dose comparison parameters, the 2D normalized agreement test (NAT) and the scalar NAT index. A single-film calibration technique using multileaf collimator (MLC) delivery was developed. EDR2 film's optical density response was found to be sensitive to several factors: radiation time, length of time between exposure and processing, and phantom material. Precision of EDR2 film measurements was found to be better than 1%. For IMRT verification, EDR2 film measurements agreed with ion chamber results to 2%/2mm accuracy for single-beam fluence map verifications and to 5%/2mm for transverse plane measurements of complete plan dose distributions. The same system was used to quantitatively optimize the radiation field offset and MLC transmission beam modeling parameters for Varian MLCs. While scalar dose comparison metrics can work well for optimization purposes, the influence of external parameters on the dose discrepancies must be minimized. The ability of 2D verifications to detect delivery errors was tested with simulated data. The dosimetric characteristics of delivery errors were compared to patient-specific clinical IMRT verifications. For the clinical verifications, the NAT index and percent of pixels failing the gamma index were exponentially distributed and dependent upon the measurement phantom but not the treatment site. 
Delivery errors affecting all beams in the treatment plan were flagged by the NAT index, although delivery errors impacting only one beam could not be differentiated from routine clinical verification discrepancies. Clinical use of this system will flag outliers, allow physicists to examine their causes, and perhaps improve the level of agreement between radiation dose distribution measurements and calculations. The principles used to design and evaluate this system are extensible to future multidimensional dose measurements and comparisons.
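The gamma index used above combines a dose-difference and a distance-to-agreement criterion into one dimensionless pass/fail value per point. A minimal 1D sketch (the toy profiles and the 3%/3 mm tolerances are my own assumptions, not the dissertation's data or code):

```python
import numpy as np

# Simple 1D gamma index: for each measured point, search the calculated
# profile for the point minimizing the combined dose/distance penalty.
def gamma_1d(measured, calculated, positions, dose_tol=0.03, dist_tol_mm=3.0):
    norm = dose_tol * measured.max()  # global dose normalization
    gammas = []
    for xm, dm in zip(positions, measured):
        terms = ((positions - xm) / dist_tol_mm) ** 2 \
              + ((calculated - dm) / norm) ** 2
        gammas.append(np.sqrt(terms.min()))
    return np.array(gammas)

x = np.arange(0.0, 10.0, 0.1)               # positions in mm
meas = np.exp(-((x - 5.0) / 3.0) ** 2)      # toy measured profile
calc = np.exp(-((x - 5.2) / 3.0) ** 2)      # calculation shifted by 0.2 mm
g = gamma_1d(meas, calc, x)
print(f"max gamma = {g.max():.2f}, pass rate = {(g <= 1).mean():.0%}")
```

A 0.2 mm shift is well inside the 3 mm distance tolerance, so every point passes; a shift or dose scaling beyond tolerance would push gamma above 1 at the steep-gradient points first.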

Relevance: 30.00%

Abstract:

Public health departments play an important role in promoting and preserving the health of communities. The lack of a system to ensure their quality and accountability led to the development of a national voluntary accreditation program by the Public Health Accreditation Board (PHAB). The concept that accreditation will lead to quality improvement in public health, which will ultimately lead to healthy communities, seems intuitive but lacks a robust body of evidence. A critical review of the literature was conducted to explore whether accreditation can lead to quality improvement in public health. The articles were selected from publicly available databases using a specific set of criteria for inclusion, exclusion, and appraisal. To understand the relationship between accreditation and quality improvement, the potential strengths and limitations of the accreditation process were evaluated. Recommendations for best practices are suggested so that public health accreditation can yield maximum benefits. A logic model framework to help depict the impact of accreditation on various levels of public health outcomes is also discussed in this thesis. The literature review shows that existing accreditation programs in other industries offer limited but encouraging evidence that accreditation will improve quality and strengthen the delivery of public health services. While progress in introducing accreditation in public health can be informed by other accredited industries, the public health field has its own set of challenges. Providing incentives, creating financing strategies, and strong leadership will allow greater access to accreditation by all public health departments. The suggested recommendations include that continuous evaluation, public participation, a systems approach, clear vision, and dynamic standards should become hallmarks of the accreditation process.
Understanding the link between accreditation, quality improvement, and health outcomes will influence the successful adoption and implementation of the public health accreditation program. This review of the literature suggests that accreditation is an important step in improving the quality of public health departments and, ultimately, the health of communities. However, accreditation should be considered part of an integrated system of tools and approaches to improve public health practice. Hence, it is a means to an end, not an end unto itself.

Relevance: 30.00%

Abstract:

Dynamic contrast-enhanced magnetic resonance imaging (DCE-MRI) is a noninvasive technique for quantitative assessment of the integrity of the blood-brain barrier and the blood-spinal cord barrier (BSCB) in the presence of central nervous system pathologies. However, the results of DCE-MRI show substantial variability. The high variability can be caused by a number of factors, including inaccurate T1 estimation, insufficient temporal resolution, and poor contrast-to-noise ratio. My thesis work is to develop improved methods to reduce the variability of DCE-MRI results. To obtain a fast and accurate T1 map, the Look-Locker acquisition technique was implemented with a novel and truly centric k-space segmentation scheme. In addition, an original multi-step curve fitting procedure was developed to increase the accuracy of T1 estimation. A view-sharing acquisition method was implemented to increase temporal resolution, and a novel normalization method was introduced to reduce image artifacts. Finally, a new clustering algorithm was developed to reduce apparent noise in the DCE-MRI data. The performance of these proposed methods was verified by simulations and phantom studies. As part of this work, the proposed techniques were applied to an in vivo DCE-MRI study of experimental spinal cord injury (SCI). These methods have shown robust results and allow quantitative assessment of regions with very low vascular permeability. In conclusion, applications of the improved DCE-MRI acquisition and analysis methods developed in this thesis can improve the accuracy of DCE-MRI results.
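The Look-Locker T1 mapping named above rests on a standard correction: the apparent recovery S(t) = A - B*exp(-t/T1*) is fitted, and T1 is recovered as T1*(B/A - 1). A minimal numpy sketch with synthetic, noiseless data; the thesis's multi-step fitting procedure is more elaborate than this grid search, which is only an illustration:

```python
import numpy as np

# Fit S(t) = A - B*exp(-t/T1*) by a grid search over T1*, with a linear
# least-squares solve for (A, B) at each candidate (the model is linear
# in A and B once T1* is fixed).
def fit_look_locker(t, signal, t1_star_grid):
    best_rss, best_fit = np.inf, None
    for t1s in t1_star_grid:
        X = np.column_stack([np.ones_like(t), -np.exp(-t / t1s)])
        coef, *_ = np.linalg.lstsq(X, signal, rcond=None)
        rss = np.sum((X @ coef - signal) ** 2)
        if rss < best_rss:
            best_rss, best_fit = rss, (coef[0], coef[1], t1s)
    return best_fit

T1_true = 1000.0                        # ms, simulated tissue T1
B_over_A = 1.8                          # assumed readout-dependent ratio
t = np.linspace(50, 5000, 40)           # sampling times in ms
t1_star_true = T1_true / (B_over_A - 1)
signal = 1.0 - B_over_A * np.exp(-t / t1_star_true)

A, B, t1_star = fit_look_locker(t, signal, np.arange(500.0, 2000.0, 5.0))
T1_est = t1_star * (B / A - 1)          # Look-Locker correction
print(f"estimated T1 ~ {T1_est:.0f} ms")
```

On noiseless data the grid recovers T1* = 1250 ms and hence T1 = 1000 ms; with noise, the quality of the T1* search and of the (A, B) estimates is exactly where a multi-step procedure earns its keep.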

Relevance: 30.00%

Abstract:

Background: The Sacred Vocation Program (SVP) (Amick B, Karff S., 2003) helps workers find meaning and spirituality and see their job as a sacred vocation. The SVP is based on Participatory Action Research (PAR) (Minkler & Wallerstein, 1997; Parker & Wall, 1998). This study aims to evaluate the SVP implemented at the Baylor Healthcare System, Dallas-Fort Worth. Methods: The study used a qualitative design, drawing on data from participants in focus groups. During these focus groups, specific questions and probes regarding the effectiveness of the SVP were asked. We analyzed the focus groups and derived themes. Results: The results demonstrate that the SVP helps graduates feel valued and important. The SVP has improved meaningful work for employees and improved a sense of belonging for participants. The program has also increased participant spirituality. The coping techniques developed during SVP classes help participants deal with stressful situations. The SVP faces challenges of implementation fidelity, poor communication, program viability in tough economic times, and implementation of phase II. Another sustainability challenge for the SVP is the perception of the program as religious rather than spiritual. Conclusion: Several aspects of the SVP work. Phase I of the SVP is successful in improving meaningful work and a sense of belonging for participants. The coping techniques help participants deal with difficult work situations. The SVP can increase effectiveness through improvements in implementation fidelity, communication, and leadership commitment.

Relevance: 30.00%

Abstract:

Background: The failure rate of health information systems is high, partially due to fragmented, incomplete, or incorrect identification and description of specific and critical domain requirements. In order to systematically transform work requirements into a real information system, an explicit conceptual framework is essential to summarize the work requirements and guide system design. Recently, Butler, Zhang, and colleagues proposed a conceptual framework called Work Domain Ontology (WDO) to formally represent users' work. This WDO approach has been successfully demonstrated in a real-world design project on aircraft scheduling. However, as a top-level conceptual framework, this WDO has not defined an explicit and well-specified schema (WDOS), and it does not have a generalizable and operationalized procedure that can be easily applied to develop a WDO. Moreover, a WDO has not been developed for any concrete healthcare domain. These limitations hinder the utility of WDO in real-world information systems in general and in health information systems in particular. Objective: The objective of this research is to formalize the WDOS, operationalize a procedure to develop WDO, and evaluate the WDO approach using the Self-Nutrition Management (SNM) work domain. Method: Concept analysis was implemented to formalize the WDOS. Focus group interviews were conducted to capture concepts in the SNM work domain. Ontology engineering methods were adopted to model the SNM WDO. Some of the concepts under the primary goal "staying healthy" for SNM were selected and transformed into a semi-structured survey to evaluate the acceptance, explicitness, completeness, consistency, and experience dependency of the SNM WDO. Result: Four concepts, "goal, operation, object and constraint", were identified and formally modeled in the WDOS with definitions and attributes. Seventy-two SNM WDO concepts under the primary goal were selected and transformed into semi-structured survey questions.
The evaluation indicated that the major concepts of the SNM WDO were accepted by 41 overweight subjects. The SNM WDO is generally independent of user domain experience but partially dependent on SNM application experience. Of 41 paired concepts, 23 had significant correlations. Two concepts were identified as ambiguous, and eight additional concepts were recommended toward the completeness of the SNM WDO. Conclusion: The preliminary WDOS is ready, with an operationalized procedure. The SNM WDO has been developed to guide future SNM application design. This research is an essential step towards Work-Centered Design (WCD).
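The four WDOS concepts named above (goal, operation, object, constraint) can be pictured as a simple schema. The attribute names and example values below are my own illustrative assumptions, not the paper's formal definitions:

```python
from dataclasses import dataclass, field

# Illustrative sketch of the four WDOS concepts as plain data classes.
@dataclass
class Constraint:
    description: str  # e.g. a dietary limit an operation must respect

@dataclass
class WorkObject:
    name: str  # something an operation acts on

@dataclass
class Operation:
    name: str
    acts_on: list = field(default_factory=list)      # WorkObject instances
    constraints: list = field(default_factory=list)  # Constraint instances

@dataclass
class Goal:
    name: str
    operations: list = field(default_factory=list)   # Operation instances

# Hypothetical SNM example under the primary goal "staying healthy".
log = WorkObject("meal log")
record = Operation("record a meal", [log], [Constraint("sodium below 2300 mg/day")])
goal = Goal("staying healthy", [record])
print(goal.name, "->", goal.operations[0].name)
```

The point of such a schema is that work requirements (goals decomposed into constrained operations on objects) can be captured before any screen or database is designed, which is the Work-Centered Design step the conclusion refers to.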

Relevance: 30.00%

Abstract:

Ultraviolet radiation plays a critical role in the induction of non-melanoma skin cancer. UV radiation is also immune suppressive, and UV-induced systemic immune suppression is a major risk factor for skin cancer induction. Previous work had shown that UV exposure in vivo activates a cytokine cascade involving PGE2, IL-4, and IL-10 that induces immune suppression. However, the earliest molecular events that occur immediately after UV exposure, especially those upstream of PGE2, were not well defined. To determine the initial events and mediators that lead to immune suppression after a pathological dose of UV, mouse keratinocytes were analyzed after sunlamp irradiation. It is known that UV-irradiated keratinocytes secrete the phospholipid mediator of inflammation, platelet-activating factor (PAF). Since PAF stimulates the production of immunomodulatory compounds, including PGE2, the hypothesis that UV-induced PAF activates cytokine production and initiates UV-induced immune suppression was tested. Both UV and PAF activated the transcription of cyclooxygenase (COX)-2 and IL-10 reporter gene constructs. A PAF receptor antagonist blocked UV-induced IL-10 and COX-2 transcription. PAF mimicked the effects of UV in vivo and suppressed delayed-type hypersensitivity (DTH), and immune suppression was blocked when UV-irradiated mice were injected with a PAF receptor antagonist. This work shows that UV generates PAF-like oxidized lipids that signal through the PAF receptor, activate cytokine transcription, and induce systemic immune suppression.