816 results for structured dependency
Abstract:
TEMPEST is a full-screen text editor that incorporates a structural paradigm in addition to the more traditional textual paradigm provided by most editors. While the textual paradigm treats the text as a sequence of characters, the structural paradigm treats it as a collection of named blocks which the user can define, group, and manipulate. Blocks can be defined to correspond to the structural features of the text, thereby providing more meaningful objects to operate on than characters or lines. The structural representation of the text is kept in the background, giving TEMPEST the appearance of a typical text editor. The structural and textual interfaces coexist equally, however, so one can always operate on the text from either point of view. TEMPEST's representation scheme provides no semantic understanding of structure. This approach sacrifices depth, but affords a broad range of applicability and requires very little computational overhead. A prototype has been implemented to illustrate the feasibility and potential areas of application of the central ideas. It was developed and runs on an IBM Personal Computer.
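As a rough illustration of the representation described above, the sketch below (in Python) models text as a flat character sequence with named blocks recorded as spans over it; the class and method names are our own invention for illustration, not TEMPEST's actual design:

    from dataclasses import dataclass, field

    @dataclass
    class Block:
        name: str
        start: int  # offset into the text buffer
        end: int    # exclusive end offset

    @dataclass
    class Buffer:
        text: str = ""
        blocks: dict = field(default_factory=dict)

        def define_block(self, name, start, end):
            # A block is just a named span; no semantic knowledge is stored.
            self.blocks[name] = Block(name, start, end)

        def block_text(self, name):
            b = self.blocks[name]
            return self.text[b.start:b.end]

    doc = Buffer("Chapter 1\nIt was a dark night.\n")
    doc.define_block("chapter-heading", 0, 9)
    print(doc.block_text("chapter-heading"))  # -> Chapter 1

Keeping blocks as plain spans is what makes the overhead negligible: the textual view remains primary, and the structural view is merely an index over it.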
Abstract:
C.H. Orgill, N.W. Hardy, M.H. Lee, and K.A.I. Sharpe. An application of a multiple agent system for flexible assembly tasks. In Knowledge based environments for industrial applications including cooperating expert systems in control. IEE, London, 1989.
Abstract:
John Warren and Chris Topping (2004). A trait-specific model of competition in a spatially structured plant community. Ecological Modelling, 180, pp. 477-485.
Abstract:
This article examines preliminary tests performed to evaluate the best electrode configuration (width and spacing) for cell culture analyses. Biochips packaged with indium tin oxide (ITO) interdigitated electrodes (IDEs) were used to perform impedance measurements on A549 cells cultured on the surface of the biochip. Several tests were carried out using a 10 mM sodium chloride (NaCl) solution, cell medium, and the cell culture itself to characterize some of the configurations already fabricated in the facilities at Tyndall National Institute.
Abstract:
Ireland experienced two critical junctures when its economic survival was threatened: 1958/9 and 1986/7. Common to both crises was the supplanting of long-established practices, which had become an integral part of the political culture of the state, by new ideas that ensured eventual economic recovery. In their adoption and implementation these ideas also fundamentally changed the institutions of state – how politics was done, how it was organised and regulated. The end result was the transformation of the Irish state. The main hypothesis of this thesis is that at those critical junctures the political and administrative elites who enabled economic recovery were not just making pragmatic decisions; their actions were influenced by ideas. Systematic content analysis of the published works of the main ideational actors, together with primary interviews with those actors still alive, reveals how their ideas were formed, what influenced them, and how they set about implementing their ideas. As the hypothesis assumes institutional change over time, historical institutionalism serves as the theoretical framework. Central to this theory is the idea that choices made when a policy is being initiated or an institution formed will have a continuing influence long into the future. Institutions of state become ‘path dependent’ and impervious to change – the forces of inertia take over. That path dependency is broken at critical junctures. At those moments ideas play a major role, as they offer a set of ready-made solutions. Historical institutionalism serves as a robust framework for proving that in the transformation of Ireland the role of ideas in punctuating institutional path dependency at critical junctures was central.
Abstract:
Introduction: Older individuals are particularly vulnerable to potentially inappropriate prescribing (PIP), drug-related problems (DRPs) and adverse drug reactions (ADRs). A number of different interventions have been proposed to address these issues. However, to date there is a paucity of well-designed trials examining the impact of such interventions. Therefore the aims of this work were to: (i) establish a baseline PIP prevalence both nationally and internationally using the STOPP, Beers and PRISCUS criteria, (ii) identify the most comprehensive method of assessing PIP in older individuals, (iii) develop a structured pharmacist intervention supported by a computer decision support system (CDSS), and (iv) examine the impact of this intervention on prescribing and the incidence of ADRs. Results: This work identified high rates of PIP across all three healthcare settings in Ireland, with 84.7% reported in long-term care, 70.7% in secondary care, and 43.3% in primary care. It also showed that a comprehensive assessment of prescribing requires all three criteria to be deployed simultaneously. High prevalences of DRPs and PIP were identified in older hospitalised individuals, with 82.0% and 76.3% of patients having at least one DRP or PIP instance, respectively. The structured pharmacist intervention demonstrated a positive impact on prescribing, with a significant reduction in MAI scores. It also gave intervention patients a reduced risk of experiencing an ADR compared to control patients (absolute risk reduction of 6.8% (95% CI 1.5%-12.3%); number needed to treat = 15 (95% CI 8-68)). However, the intervention was found to have no significant effect on length of stay or mortality rate. Conclusion: This work shows that PIP is highly prevalent in older individuals across three healthcare settings in Ireland. It also demonstrates that a structured pharmacist intervention supported by a dedicated CDSS can significantly improve the appropriateness of prescribing and reduce the incidence of ADRs in older acutely ill hospitalised individuals.
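As a quick check on the figures above, the number needed to treat is the reciprocal of the absolute risk reduction:

\[
\text{NNT} = \frac{1}{\text{ARR}} = \frac{1}{0.068} \approx 14.7,
\]

which rounds up to the reported value of 15.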
Abstract:
This research investigates some of the reasons for the reported difficulties experienced by writers when using editing software designed for structured documents. The overall objective was to determine whether there are aspects of the software interfaces which militate against optimal document construction by writers who are not computer experts, and to suggest possible remedies. Studies were undertaken to explore the nature and extent of the difficulties, and to identify which components of the software interfaces are involved. A model of a revised user interface was tested, and some possible adaptations to the interface are proposed which may help overcome the difficulties. The methodology comprised:
1. identification and description of the nature of a ‘structured document’ and what distinguishes it from other types of document used on computers;
2. isolation of the requirements of users of such documents, and the construction of a set of personas which describe them;
3. evaluation of other work on the interaction between humans and computers, specifically in software for creating and editing structured documents;
4. estimation of the levels of adoption of the available software for editing structured documents and the reactions of existing users to it, with specific reference to difficulties encountered in using it;
5. examination of the software and identification of any mismatches between the expectations of users and the facilities provided by the software;
6. assessment of any physical or psychological factors in the reported difficulties experienced, and determination of what (if any) changes to the software might affect these.
The conclusions are that seven of the twelve modifications tested could contribute to an improvement in usability, effectiveness, and efficiency when writing structured text (new document selection; adding new sections and new lists; identifying key information typographically; the creation of cross-references and bibliographic references; and the inclusion of parts of other documents). The remaining five were seen as more applicable to editing existing material than authoring new text (adding new elements; splitting and joining elements [before and after]; and moving block text).
Abstract:
Flavour release from food is determined by the binding of flavours to other food ingredients and the partition of flavour molecules among different phases. Food emulsions are used as delivery systems for food flavours, and tailored structuring in emulsions provides novel means to better control flavour release. The current study investigated four structured oil-in-water emulsions with structuring in the oil phase, oil-water interface, and water phase. Oil phase structuring was achieved by the formation of monoglyceride (MG) liquid crystals in the oil droplets (MG structured emulsions). A structured interface was created by the adsorption of a whey protein isolate (WPI)-pectin double layer at the interface (multilayer emulsion). Water phase structured emulsions refer to emulsion filled protein gels (EFP gels), where emulsion droplets were embedded in a WPI gel network, and to emulsions with maltodextrins (MDs) of different dextrose-equivalent (DE) values. Flavour compounds with different physicochemical properties were added to the emulsions, and flavour release (release rate, headspace concentration, and air-emulsion partition coefficient) was measured by GC headspace analysis. Emulsion structures, including crystalline structure, particle size, emulsion stability, rheology, texture, and microstructure, were characterized using differential scanning calorimetry and X-ray diffraction, light scattering, a multisample analytical centrifuge, rheometry, texture analysis, and confocal laser scanning microscopy, respectively. In MG structured emulsions, MG self-assembled into liquid crystalline structures and stable β-form crystals were formed after 3 days of storage at 25 °C. The inclusion of MG crystals allowed Tween 20-stabilized emulsions to present viscoelastic properties, and it made WPI-stabilized emulsions more sensitive to changes in pH and NaCl concentration. Flavour compounds in MG structured emulsions had lower initial headspace concentrations and air-emulsion partition coefficients than those in unstructured emulsions. Flavour release can be modulated by changing MG content, oil content, and oil type. WPI-pectin multilayer emulsions were stable at pH 5.0, 4.0, and 3.0, but they presented extensive creaming when subjected to salt solutions with NaCl ≥ 150 mM and when mixed with artificial saliva. Increasing the pH from 5.0 to 7.0 resulted in a higher headspace concentration but an unchanged release rate, and increasing the NaCl concentration led to increased headspace concentration and release rate. The study also showed that saliva could trigger higher release of hydrophobic flavours and lower release of hydrophilic flavours. In EFP gels, increases in protein content and oil content contributed to gels with higher storage modulus and force at breaking. Flavour compounds had significantly lower release rates and air-emulsion partition coefficients in the gels than in the corresponding ungelled emulsions, and the reduction was in line with the increase in protein content. Gels with a stronger gel network but lower oil content were prepared, and lower or unaffected release rates of the flavours were observed. In emulsions containing maltodextrins, water froze at a much lower temperature, and emulsion stability was greatly improved when subjected to freeze-thawing. Among the different MDs, MD DE 6 offered the emulsion the highest stability. Flavours had lower air-emulsion partition coefficients in the emulsions with MDs than in the emulsion without MD. Moreover, the inclusion of MDs in the emulsions meant that most flavours had similar release profiles before and after freeze-thaw treatment. The present study provides information about different structured emulsions as delivery systems for flavour compounds, and about how food structure can be designed to modulate flavour release, which could be helpful in the development of functional foods with improved flavour profiles.
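For reference, the air-emulsion partition coefficient used throughout is conventionally defined as the equilibrium concentration ratio (the symbols here are ours, not necessarily the thesis's notation):

\[
K_{\text{air/emulsion}} = \frac{C_{\text{air}}}{C_{\text{emulsion}}},
\]

so a lower coefficient means more flavour is retained in the emulsion phase.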
Abstract:
We consider the problem of variable selection in regression modeling in high-dimensional spaces where there is known structure among the covariates. This is an unconventional variable selection problem for two reasons: (1) the dimension of the covariate space is comparable to, and often much larger than, the number of subjects in the study, and (2) the covariate space is highly structured, and in some cases it is desirable to incorporate this structural information into the model-building process. We approach this problem through the Bayesian variable selection framework, where we assume that the covariates lie on an undirected graph and formulate an Ising prior on the model space for incorporating structural information. Certain computational and statistical problems arise that are unique to such high-dimensional, structured settings, the most interesting being the phenomenon of phase transitions. We propose theoretical and computational schemes to mitigate these problems. We illustrate our methods on two different graph structures: the linear chain and the regular graph of degree k. Finally, we use our methods to study a specific application in genomics: the modeling of transcription factor binding sites in DNA sequences. © 2010 American Statistical Association.
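A standard form of such an Ising prior over inclusion indicators \(\gamma \in \{0,1\}^p\) on a graph \(G = (V, E)\) is (the paper's exact parameterization may differ):

\[
\pi(\gamma) \propto \exp\Big( a \sum_{i \in V} \gamma_i + b \sum_{(i,j) \in E} \gamma_i \gamma_j \Big),
\]

where \(a\) controls overall sparsity and \(b > 0\) encourages covariates that are neighbours in the graph to enter or leave the model together; the phase-transition phenomenon mentioned above concerns the behaviour of this distribution as \(b\) grows.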
Abstract:
The authors of this study evaluated a structured 10-session psychosocial support group intervention for newly HIV-diagnosed pregnant South African women. Participants were expected to display increases in HIV disclosure, self-esteem, active coping and positive social support, and decreases in depression, avoidant coping, and negative social support. Three hundred sixty-one pregnant HIV-infected women were recruited from four antenatal clinics in Tshwane townships from April 2005 to September 2006. Using a quasi-experimental design, assessments were conducted at baseline and at two and eight months post-intervention. A series of random effects regression analyses were conducted, with the three assessment points treated as a random effect of time. At both follow-ups, the rate of disclosure in the intervention group was significantly higher than that of the comparison group (p<0.001). Compared to the comparison group at the first follow-up, the intervention group displayed higher levels of active coping (t=2.68, p<0.05) and lower levels of avoidant coping (t=-2.02, p<0.05), and those who attended at least half of the intervention sessions exhibited improved self-esteem (t=2.11, p<0.05). Group interventions tailored for newly HIV-positive pregnant women, implemented in resource-limited settings, may accelerate the process of adjusting to one's HIV status, but may not have sustainable benefits over time.
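One common specification of such a random-effects model, with a subject-level random intercept \(u_i\) over the assessment points \(t\), would be (the study's exact model may differ):

\[
y_{it} = \beta_0 + \beta_1\,\text{group}_i + \beta_2\, t + \beta_3\,(\text{group}_i \times t) + u_i + \varepsilon_{it},
\qquad u_i \sim N(0, \sigma_u^2).
\]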
Abstract:
Intraoperative assessment of surgical margins is critical to ensuring residual tumor does not remain in a patient. Previously, we developed a fluorescence structured illumination microscope (SIM) system with a single-shot field of view (FOV) of 2.1 × 1.6 mm (3.4 mm²) and sub-cellular resolution (4.4 μm). The goal of this study was to test the utility of this technology for the detection of residual disease in a genetically engineered mouse model of sarcoma. Primary soft tissue sarcomas were generated in the hindlimb, and after the tumor was surgically removed, the relevant margin was stained with acridine orange (AO), a vital stain that brightly stains cell nuclei and fibrous tissues. The tissues were imaged with the SIM system with the primary goal of visualizing fluorescent features from tumor nuclei. Given the heterogeneity of the background tissue (presence of adipose tissue and muscle), an algorithm known as maximally stable extremal regions (MSER) was optimized and applied to the images to specifically segment nuclear features. A logistic regression model was used to classify a tissue site as positive or negative from the area fraction and shape of the segmented features, and the resulting receiver operating characteristic (ROC) curve was generated by varying the probability threshold. Based on the ROC curves, the model was able to classify tumor and normal tissue with 77% sensitivity and 81% specificity (Youden's index). For an unbiased measure of the model performance, it was applied to a separate validation dataset, which resulted in 73% sensitivity and 80% specificity. When this approach was applied to representative whole margins, at a tumor probability threshold of 50%, only 1.2% of all regions from the negative margin exceeded the threshold, while over 14.8% of all regions from the positive margin did.
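A minimal sketch (in Python) of the segmentation-plus-classification pipeline described above, assuming OpenCV's generic MSER detector and scikit-learn's LogisticRegression; the thresholds, feature definitions, and names are illustrative assumptions, not the authors' code:

    import cv2
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import roc_curve

    def region_features(gray_img):
        """Segment nucleus-like blobs with MSER and summarise their area and shape."""
        mser = cv2.MSER_create(5, 20, 500)  # delta, min_area, max_area (tune for SIM images)
        regions, _ = mser.detectRegions(gray_img)
        if len(regions) == 0:
            return np.array([0.0, 0.0])
        areas = np.array([len(r) for r in regions], dtype=float)
        # Roundness proxy per region: ratio of bounding-box sides (1.0 = square).
        shapes = [min(w, h) / max(w, h)
                  for (x, y, w, h) in (cv2.boundingRect(r) for r in regions)]
        area_fraction = areas.sum() / gray_img.size  # approximate nuclear area fraction
        return np.array([area_fraction, np.mean(shapes)])

    # X: one feature vector per tissue site; labels: 1 = tumour, 0 = normal.
    # `images` and `labels` stand in for the training data.
    # X = np.stack([region_features(im) for im in images])
    # clf = LogisticRegression().fit(X, labels)
    # fpr, tpr, thresholds = roc_curve(labels, clf.predict_proba(X)[:, 1])

Sweeping over `thresholds` is what traces out the ROC curve from which the sensitivity/specificity operating point is chosen.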
Abstract:
Computer Aided Parallelisation Tools (CAPTools) is a toolkit designed to automate as much as possible of the process of parallelising scalar FORTRAN 77 codes. The toolkit combines a very powerful dependence analysis with user-supplied knowledge to build an extremely comprehensive and accurate dependence graph. The initial version has been targeted at structured mesh computational mechanics codes (e.g. heat transfer, Computational Fluid Dynamics (CFD)), and the associated simple mesh decomposition paradigm is utilised in the automatic code partition, execution control mask generation, and communication call insertion. In this, the first of a series of papers [1-3], the authors discuss the parallelisation of a number of case study codes, showing how the various component tools may be used to develop a highly efficient parallel implementation in a few hours or days. The details of the parallelisation of the TEAMKE1 CFD code are described together with the results for three other numerical codes. The resulting parallel implementations are then tested on workstation clusters using PVM and on an i860-based parallel system, showing efficiencies well over 80%.
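To make the role of the dependence graph concrete, here is the classic kind of loop-carried flow dependence such an analysis must detect (shown in Python for brevity; CAPTools itself operates on FORTRAN 77):

    a = [1.0] * 10
    for i in range(1, len(a)):
        # Iteration i reads a[i-1], written by iteration i-1: a flow
        # dependence that forbids naively running the iterations in parallel.
        a[i] = a[i] + a[i - 1]

Edges like this one, together with user-supplied knowledge that may rule some of them out, are what the toolkit records in its dependence graph before partitioning.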
Abstract:
Parallel computing is now widely used in numerical simulation, particularly for application codes based on finite difference and finite element methods. A popular and successful technique employed to parallelise such codes onto large distributed memory systems is to partition the mesh into sub-domains that are then allocated to processors. The code then executes in parallel, using the SPMD methodology, with message passing for inter-processor interactions. In order to improve the parallel efficiency of an imbalanced structured mesh CFD code, a new dynamic load balancing (DLB) strategy has been developed in which the processor partition range limits of just one of the partitioned dimensions use non-coincidental limits, as opposed to coincidental limits. The ‘local’ partition limit change allows greater flexibility in obtaining a balanced load distribution, as the workload increase, or decrease, on a processor is no longer restricted by the ‘global’ (coincidental) limit change. The automatic implementation of this generic DLB strategy within an existing parallel code is presented in this chapter, along with some preliminary results.
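A toy sketch (in Python) of the non-coincidental-limits idea, assuming a simple per-cell workload model; all names here are illustrative, not the chapter's implementation. Column cuts are global, but each column slab computes its own row cuts:

    import numpy as np

    def balanced_cuts(weights, nparts):
        """End indices of nparts contiguous chunks with roughly equal total weight."""
        cum = np.cumsum(weights)
        targets = cum[-1] * np.arange(1, nparts + 1) / nparts
        return np.searchsorted(cum, targets) + 1

    work = np.random.rand(64, 48) ** 2   # synthetic imbalanced cost per mesh cell
    pcols, prows = 4, 3                  # 4 x 3 processor grid

    # Global (coincidental) cuts in the column dimension...
    col_cuts = balanced_cuts(work.sum(axis=0), pcols)

    # ...but row cuts chosen independently inside each column slab, which is
    # the extra freedom that lets an imbalanced load be evened out.
    row_cuts, start = [], 0
    for end in col_cuts:
        row_cuts.append(balanced_cuts(work[:, start:end].sum(axis=1), prows))
        start = end

With coincidental limits, every slab would be forced to share one set of row cuts, so a hotspot in one slab would skew the load of every processor in that row.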
Abstract:
The most common parallelisation strategy for many Computational Mechanics (CM) codes (typified by Computational Fluid Dynamics (CFD) applications) which use structured meshes involves a 1D partition based upon slabs of cells. However, many CFD codes employ pipeline operations in their solution procedure. For parallelised versions of such codes to scale well they must employ two (or more) dimensional partitions. This paper describes an algorithmic approach to multi-dimensional mesh partitioning in code parallelisation, its implementation in a toolkit for almost automatically transforming scalar codes to parallel form, and its testing on a range of ‘real-world’ FORTRAN codes. The concept of multi-dimensional partitioning is straightforward, but non-trivial to represent as a sufficiently generic algorithm so that it can be embedded in a code transformation tool. The results of the tests on these real-world codes demonstrate clear improvements in parallel performance and scalability (over a 1D partition). This is matched by a huge reduction in the time required to develop the parallel versions compared with hand coding – from weeks/months down to hours/days.
Abstract:
The purpose of this article is to gain insight into the effects of practicing short, frequent, and structured reflection breaks interspersed with the learning material in a computer-based course. To that end, the study set up a controlled trial with two groups of secondary school pupils. The study shows that while performance is not affected by these embedded “reflection rituals,” they significantly impact time on task and perceived learning. The study also suggests that exposure to such built-in opportunities for reflection modifies engagement with the content and fosters the claimed readiness to apply a similar reflective approach to learning on other occasions.