Abstract:
We describe the occurrence of non-marine bivalves in exposures of the Middle Permian (Capitanian) Brenton Loch Formation on the southern shore of Choiseul Sound, East Falklands. The bivalves are associated with ichnofossils and were collected from a bed in the upper part of the formation, within a 25 cm thick interval of dark siltstones and mudstones with planar lamination, overlain by massive sandstones. The shells are articulated, with the valves either splayed open or closed. At the top of the succession, mudstone beds nearly 1.5 m above the bivalve-bearing layers yielded well-preserved Glossopteris sp. cf. G. communis leaf fossils. The closed articulated condition of some shells indicates preservation under high sedimentation rates with low residence time of bioclasts at the sediment/water interface. However, the presence of specimens with splayed shells is usually correlated with the slow decay of the shell ligament in oxygen-deficient bottom waters. The presence of complete carbonized leaves of Glossopteris associated with the bivalve-bearing levels also suggests a possibly dysoxic-anoxic bottom environment. Overall, our data suggest that the bivalves were preserved by abrupt burial, possibly by distal sediment flows into a Brenton Loch lake, and may represent autochthonous to parautochthonous fossil accumulations. The shells resemble those of anthracosiids and are herein assigned to Palaeanodonta sp. aff. P. dubia, a species also found in the Permian succession of the Karoo Basin, South Africa. Our results confirm that (a) the true distributions in space and time of all Permian non-marine (freshwater) bivalves are not yet well known, and (b) there is no evidence for marine conditions in the upper part of the Brenton Loch Formation.
Abstract:
This study aimed to verify the influence of transport in open or closed compartments (0 h), followed by two resting periods (1 and 3 h) before the slaughter process, on cortisol levels as an indicator of stress. At the slaughterhouse, blood samples were taken from 86 lambs after transport and before slaughter for plasma cortisol analysis. The method of transport influenced the cortisol concentration (0 h; P < 0.01): the animals transported in the closed compartment had a lower level (28.97 ng ml(-1)) than the animals transported in the open compartment (35.49 ng ml(-1)). After the resting period in the slaughterhouse, there was a decline in the plasma cortisol concentration, with the animals subjected to 3 h of rest presenting a lower average cortisol value (24.14 ng ml(-1); P < 0.05) than animals subjected to 1 h of rest (29.95 ng ml(-1)). It can be inferred that the lambs that waited 3 h before slaughter had more time to recover from the stress of transportation than those that waited just 1 h. Visual access to the external environment during transport is a stressful factor that changes the plasma cortisol level, and the resting period before slaughter was effective in lowering stress, reducing plasma cortisol in the lambs. (c) 2012 Elsevier B.V. All rights reserved.
Abstract:
Previous work has shown that the α-tocopherol transfer protein (α-TTP) can bind to vesicular or immobilized phospholipid membranes. Revealing the molecular mechanisms by which α-TTP associates with membranes is thought to be critical to understanding its function and role in the secretion of tocopherol from hepatocytes into the circulation. Calculations presented in the Orientations of Proteins in Membranes database have provided a testable model for the spatial arrangement of α-TTP and other CRAL-TRIO family proteins with respect to the lipid bilayer. These calculations predicted that a hydrophobic surface mediates the interaction of α-TTP with lipid membranes. To test the validity of these predictions, we used site-directed mutagenesis and examined the substituted mutants with regard to intermembrane ligand transfer, association with lipid layers and biological activity in cultured hepatocytes. Substitution of residues in helices A8 (F165A and F169A) and A10 (I202A, V206A and M209A) decreased the rate of intermembrane ligand transfer as well as protein adsorption to phospholipid bilayers. The largest impairment was observed upon mutation of residues that are predicted to be fully immersed in the lipid bilayer in both apo (open) and holo (closed) conformations such as Phe165 and Phe169. Mutation F169A, and especially F169D, significantly impaired α-TTP-assisted secretion of α-tocopherol outside cultured hepatocytes. Mutation of selected basic residues (R192H, K211A, and K217A) had little effect on transfer rates, indicating no significant involvement of nonspecific electrostatic interactions with membranes.
Abstract:
Neural correlates of electroencephalographic (EEG) alpha rhythm are poorly understood. Here, we related EEG alpha rhythm in awake humans to blood-oxygen-level-dependent (BOLD) signal change determined by functional magnetic resonance imaging (fMRI). Topographical EEG was recorded simultaneously with fMRI during an open versus closed eyes and an auditory stimulation versus silence condition. EEG was separated into spatial components of maximal temporal independence using independent component analysis. Alpha component amplitudes and stimulus conditions served as general linear model regressors of the fMRI signal time course. In both paradigms, EEG alpha component amplitudes were associated with BOLD signal decreases in occipital areas, but not in thalamus, when a standard BOLD response curve (maximum effect at approximately 6 s) was assumed. The part of the alpha regressor independent of the protocol condition, however, revealed significant positive thalamic and mesencephalic correlations with a mean time delay of approximately 2.5 s between EEG and BOLD signals. The inverse relationship between EEG alpha amplitude and BOLD signals in primary and secondary visual areas suggests that widespread thalamocortical synchronization is associated with decreased brain metabolism. While the temporal relationship of this association is consistent with metabolic changes occurring simultaneously with changes in the alpha rhythm, sites in the medial thalamus and in the anterior midbrain were found to correlate with short time lag. Assuming a canonical hemodynamic response function, this finding is indicative of activity preceding the actual EEG change by some seconds.
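The general linear model analysis described above, with EEG alpha amplitude as a regressor of the BOLD time course, can be sketched as follows. This is an illustrative reconstruction using synthetic signals and a toy gamma-shaped HRF, not the study's actual pipeline:

```python
import numpy as np

rng = np.random.default_rng(0)
TR = 2.0                               # repetition time [s] (assumed)
t = np.arange(0, 300, TR)              # 150 scans

def hrf(tax):
    """Toy gamma-shaped hemodynamic response, peaking near 5-6 s."""
    h = tax ** 5 * np.exp(-tax)
    return h / h.max()

# Synthetic per-scan EEG alpha amplitude, convolved with the HRF to build
# the fMRI regressor (as in a standard GLM analysis).
alpha = rng.standard_normal(t.size)
kernel = hrf(np.arange(0, 30, TR))
regressor = np.convolve(alpha, kernel)[: t.size]

# Simulate an occipital voxel with a NEGATIVE alpha-BOLD coupling plus noise.
bold = -0.8 * regressor + 0.1 * rng.standard_normal(t.size)

# Ordinary least squares: design matrix = [alpha regressor, constant].
X = np.column_stack([regressor, np.ones(t.size)])
beta, *_ = np.linalg.lstsq(X, bold, rcond=None)
# beta[0] recovers the negative coupling (alpha up -> BOLD down)
```

A positive correlation with a shorter lag, as reported for the thalamus, would be modelled the same way but with a shifted or different response kernel.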
Abstract:
Within the past 15 years, significant advances in the imaging of multiorgan and complex trauma primarily due to the improvement of cross-sectional imaging have resulted in the optimization of the expedient diagnosis and management of the polytrauma patient. At the forefront, multidetector computed tomography (MDCT) has become the cornerstone of modern emergency departments and trauma centers. In many institutions, MDCT is the de facto diagnostic tool upon trauma activation. In the setting of pelvic imaging, MDCT (with its high spatial resolution and sensitivity as well as short acquisition times) allows for rapid identification and assessment of pelvic hemorrhage leading to faster triage and definitive management. In trauma centers throughout the world, angiography and minimally invasive catheter-based embolization techniques performed by interventional radiologists have become the standard of care for patients with acute pelvic trauma and related multiorgan hemorrhage. In an interdisciplinary setting, embolization may be performed either alone or as an adjunct procedure with open or closed reduction and stabilization techniques. A team-based approach involving multiple disciplines (e.g., radiology, traumatology, orthopedic surgery, intensive care medicine) is crucial to monitor and treat the actively bleeding patient appropriately.
Abstract:
Background. Human trafficking, or "modern day slavery", is a complex public health issue that we must understand more fully before it can be effectively tackled. There have been very few empirical studies on human trafficking, and estimates of global and national human trafficking victims vary widely. Free the Slaves, a non-profit organization, estimates that there are at least 27 million people in the world at any given time who can be classified as victims of human trafficking. Houston, Texas has been identified as a place where human trafficking may be more likely to exist due to its close proximity to Mexico and to economic and population factors. Houston Rescue and Restore Coalition (HRRC) is a local organization that exists to raise awareness of human trafficking in Houston, Texas. To better serve victims of human trafficking, HRRC commissioned a community assessment of the services available to victims of human trafficking in the greater Houston metropolitan area. Purpose. The current study assessed the capacity of organizations and agencies within the greater Houston metropolitan area to deal with human trafficking issues; in particular, knowledge regarding human trafficking issues among these organizations and agencies was evaluated. Methods. A cross-sectional study design was used to conduct surveys with key informants/stakeholders from organizations and agencies within the greater Houston metropolitan area. The survey instrument included 41 items in three parts, and consisted of multiple choice questions, open-ended essay questions, and closed-ended 5-point Likert questions. Results. The findings from this study indicate that efforts must be made to increase comprehensive awareness of the issue of human trafficking, including the federal and state laws that have been enacted to combat this problem. The data also indicate that there are limited services provided to human trafficking victims within the greater Houston metropolitan area. Conclusion. The results of the survey will provide Houston Rescue and Restore Coalition with information that will assist them in targeting their efforts to combat human trafficking in Houston, Texas.
Abstract:
Over a 2-year study, we investigated the effect of environmental change on the diversity and abundance of soil arthropod communities (Acari and Collembola) in the Maritime Antarctic and the Falkland Islands. Open Top Chambers (OTCs), as used extensively in the framework of the northern boreal International Tundra Experiment (ITEX), were used to increase the temperature in contrasting communities on three islands along a latitudinal temperature gradient, ranging from the Falkland Islands (51°S, mean annual temperature 7.5 °C) to Signy Island (60°S, -2.3 °C) and Anchorage Island (67°S, -3.8 °C). At each island an open and a closed plant community were studied: lichen vs. moss at the Antarctic sites, and grass vs. dwarf shrub at the Falkland Islands. The OTCs raised the soil surface temperature during most months of the year. During the summer the level of warming achieved was 1.7 °C at the Falkland Islands, 0.7 °C at Signy Island, and 1.1 °C at Anchorage Island. The native arthropod community diversity decreased with increasing latitude. In contrast with this pattern, Collembola abundance in the closed vegetation (dwarf shrub or moss) communities increased by at least an order of magnitude from the Falkland Islands (9.0 ± 2 × 10^3 ind./m^2) to Signy (3.3 ± 8.0 × 10^4 ind./m^2) and Anchorage Island (3.1 ± 0.82 × 10^5 ind./m^2). The abundance of Acari did not show a latitudinal trend. Abundance and diversity of Acari and Collembola were unaffected by the warming treatment on the Falkland Islands and Anchorage Island. However, after two seasons of experimental warming, the total abundance of Collembola decreased (p < 0.05) in the lichen community on Signy Island as a result of the population decline of the isotomid Cryptopygus antarcticus. In the same lichen community there was also a decline (p < 0.05) of the mesostigmatid predatory mite Gamasellus racovitzai, and a significant increase in the total number of Prostigmata.
Overall, our data suggest that the consequences of an experimental temperature increase of 1-2 °C, comparable in magnitude to recent climate change in the Antarctic Peninsula region, for soil arthropod communities may differ between locations but are most likely to be small and initially slow to develop.
Abstract:
Hybrid Stepper Motors are widely used in open-loop position applications. They are the choice of actuation for the collimators in the Large Hadron Collider, the largest particle accelerator at CERN. In this case the positioning requirements and the highly radioactive operating environment are unique. The latter forces the use of long cables to connect the motors to the drives, which act as transmission lines, and also prevents the use of standard position sensors. However, reliable and precise operation of the collimators is critical for the machine, requiring the prevention of step loss in the motors and maintenance to be foreseen in case of mechanical degradation. In order to make the above possible, an approach is proposed for the application of an Extended Kalman Filter to a sensorless stepper motor drive, when the motor is separated from its drive by long cables. When the long cables and high-frequency pulse width modulated control voltage signals are used together, the electrical signals differ greatly between the motor and drive-side of the cable. Since in the considered case only drive-side data is available, it is therefore necessary to estimate the motor-side signals. Modelling the entire cable and motor system in an Extended Kalman Filter is too computationally intensive for standard embedded real-time platforms. It is, in consequence, proposed to divide the problem into an Extended Kalman Filter, based only on the motor model, and separate motor-side signal estimators, the combination of which is less demanding computationally. The effectiveness of this approach is shown in simulation. Then its validity is experimentally demonstrated via implementation in a DSP based drive. A testbench to test its performance when driving an axis of a Large Hadron Collider collimator is presented along with the results achieved.
It is shown that the proposed method is capable of achieving position and load torque estimates which allow step loss to be detected and mechanical degradation to be evaluated without the need for physical sensors. These estimation algorithms often require a precise model of the motor, but the standard electrical model used for hybrid stepper motors is limited when currents high enough to saturate the magnetic circuit are present. New model extensions are proposed in order to have a more precise model of the motor independently of the current level, whilst maintaining a low computational cost. It is shown that a significant improvement in the model is achieved with these extensions, and their computational performance is compared in order to weigh model improvement against computational cost. The applicability of the proposed model extensions is demonstrated via their use in an Extended Kalman Filter running in real-time for closed-loop current control and mechanical state estimation. An additional problem arises from the use of stepper motors. The mechanics of the collimators can wear due to the abrupt motion and torque profiles applied when they are used in the standard way, i.e. stepping in open-loop. Closed-loop position control, more specifically Field Oriented Control, would allow smoother profiles, more respectful of the mechanics, to be applied, but requires position feedback. As mentioned already, the use of sensors in radioactive environments is very limited for reliability reasons. Sensorless control is a known option, but when the speed is very low or zero, as is the case most of the time for the motors used in the LHC collimators, the loss of observability prevents its use.
In order to allow the use of position sensors without reducing the long-term reliability of the whole system, the possibility to switch from closed to open loop is proposed and validated, allowing the use of closed-loop control when the position sensors function correctly and open-loop control when there is a sensor failure. A different approach to deal with the switched drive working with long cables is also presented. Switched mode stepper motor drives tend to have poor performance, or even fail completely, when the motor is fed through a long cable, due to the high oscillations in the drive-side current. The design of a stepper motor output filter which solves this problem is thus proposed. A two-stage filter, one stage devoted to dealing with the differential mode and the other with the common mode, is designed and validated experimentally. With this filter the drive performance is greatly improved, achieving a positioning repeatability even better than with the drive working without a long cable; the radiated emissions are reduced and the overvoltages at the motor terminals are eliminated.
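The estimation approach above is built around the standard Extended Kalman Filter predict/update cycle. A minimal sketch, assuming a simple linear position/velocity model with illustrative noise parameters (this is not the thesis's actual motor model; for a linear model the EKF Jacobians reduce to the matrices F and H themselves):

```python
import numpy as np

dt = 1e-3                          # control period [s] (assumed)
F = np.array([[1.0, dt],           # state transition: [position, velocity]
              [0.0, 1.0]])
H = np.array([[1.0, 0.0]])         # only position is "measured" (estimated
                                   # from drive-side signals in the thesis)
Q = np.diag([1e-8, 1e-6])          # process noise covariance (illustrative)
R = np.array([[1e-4]])             # measurement noise covariance (illustrative)

def ekf_step(x, P, z):
    """One predict/update cycle of the (here linear) Kalman filter."""
    # Predict
    x_pred = F @ x
    P_pred = F @ P @ F.T + Q
    # Update
    y = z - H @ x_pred                       # innovation
    S = H @ P_pred @ H.T + R                 # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)      # Kalman gain
    return x_pred + K @ y, (np.eye(2) - K @ H) @ P_pred

# Track a constant-velocity trajectory (v = 2.0) from noisy position samples.
rng = np.random.default_rng(0)
x, P = np.zeros((2, 1)), np.eye(2)
for k in range(1, 2001):
    z = np.array([[2.0 * k * dt + rng.normal(0.0, 1e-2)]])
    x, P = ekf_step(x, P, z)
# x[1, 0] (the velocity estimate) should approach 2.0
```

In the real drive the state vector would also carry currents and load torque, and the Jacobians would be recomputed each step from the nonlinear motor model.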
Abstract:
Coupling the shop floor software system (SFS) with the set of production equipment (SPE) is a complex task. It involves open and proprietary standards, and information and communication technologies, among other tools and techniques.
Due to market turbulence, either custom solutions or standards-based solutions eventually require a considerable adaptation effort. The loose coupling concept has been identified in the organizational design community as a support for organizational survival: its presence reduces the organization's resistance to changes in the environment. In this paper the results obtained by the organizational design community are identified, translated and organized to support the solution of the SFS-SPE integration problem. A classical loose coupling model developed by the organizational studies community is abstracted and translated to the area of interest. Key aspects are identified to be used as promoters of SFS-SPE loose coupling and presented in the form of a reference scheme. Furthermore, this reference scheme is proposed as a basis for the design and implementation of a generic coupling solution, or coupling framework, that is included as a loose coupling stage between SFS and SPE. A validation example with various sets of manufacturing equipment, using different physical communication media, controller commands, programming languages and wire protocols, is presented, showing an acceptable level of autonomy gained by the SFS.
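The loose coupling stage between SFS and SPE can be sketched as an adapter layer: the SFS depends on one abstract interface, while each adapter hides a device's physical medium, command set and wire protocol. The interface, class names and commands below are our own hypothetical illustration, not the paper's framework:

```python
from abc import ABC, abstractmethod

class EquipmentAdapter(ABC):
    """Uniform interface the SFS depends on; concrete devices vary behind it."""
    @abstractmethod
    def send(self, command: str) -> str: ...

class SerialCNCAdapter(EquipmentAdapter):
    def send(self, command: str) -> str:
        # A real adapter would write to a serial port using the CNC's
        # controller commands; here we just echo an acknowledgement.
        return f"CNC ack: {command}"

class EthernetRobotAdapter(EquipmentAdapter):
    def send(self, command: str) -> str:
        # A real adapter would open a socket and speak the robot's protocol.
        return f"ROBOT ack: {command}"

class ShopFloorSystem:
    """The SFS dispatches by station name and never sees concrete devices."""
    def __init__(self, adapters: dict[str, EquipmentAdapter]):
        self.adapters = adapters

    def dispatch(self, station: str, command: str) -> str:
        return self.adapters[station].send(command)

sfs = ShopFloorSystem({"mill": SerialCNCAdapter(), "arm": EthernetRobotAdapter()})
print(sfs.dispatch("mill", "G0 X10"))   # prints "CNC ack: G0 X10"
```

Swapping equipment, media or protocols then only swaps adapters, which is the autonomy the abstract attributes to the loose coupling stage.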
Abstract:
Internet is changing everything, and this revolution is especially present in traditionally offline spaces such as medicine. In recent years health consumers and health service providers have been actively creating and consuming Web contents, stimulated by the emergence of the Social Web. Reliability stands out as the main concern when accessing the overwhelming amount of information available online. Along with this new way of accessing medicine, new concepts like ubiquitous or pervasive healthcare are appearing. Trustworthiness assessment is gaining relevance: open health provisioning systems require mechanisms that help evaluate individuals' reputation in pursuit of introducing safety to these open and dynamic environments. Technology Enhanced Learning (TEL) platforms -commonly known as eLearning platforms- arise as a paradigm of this Medicine 2.0. They provide open yet controlled/supervised access to resources generated and shared by users, enhancing what is being called informal learning. TEL systems also facilitate direct interactions amongst users for consultation, resulting in a good approach to ubiquitous healthcare.
The aforementioned reliability and trustworthiness problems can be faced by the implementation of mechanisms for the trusted recommendation of both resources and healthcare service providers. Traditionally, eLearning platforms already integrate recommendation mechanisms, although these recommendations are basically focused on providing an ordered classification of resources. For users' recommendation, the implementation of trust and reputation systems appears as the best solution. Nevertheless, both approaches base the recommendation on information from the subjective opinions of other users of the platform regarding the resources or the users. In this PhD work a novel approach is presented for the recommendation of both resources and users within open environments focused on knowledge exchange, as is the case of TEL systems for ubiquitous healthcare. The proposed solution adds the objective evaluation of the resources to the traditional subjective personal opinions to estimate the reputation of the resources and of the users of the system. This combined measure, along with the reliability of that calculation, is used to provide trusted recommendations. The integration of opinions and evaluations, subjective and objective, allows the model to defend itself against misbehaviours. Furthermore, it also allows 'colouring' cold evaluation values by providing additional quality information, such as the educational capacities of a digital resource in an eLearning system. As a result, the recommendations are always adapted to user requirements, and of the maximum technical and educational quality. To our knowledge, the combination of objective assessments and subjective opinions to provide recommendation has not been considered before in the literature. Therefore, for the evaluation of the trust and reputation model defined in this PhD thesis, a new simulation tool was developed following the agent-oriented programming paradigm.
The multi-agent approach allows easy modelling of independent and proactive behaviours for the simulation of users of the system, conforming a faithful resemblance of real users of TEL platforms. For the evaluation of the proposed work, an iterative approach has been followed, testing the performance of the trust and reputation model while providing recommendations in a varied range of scenarios. A comparison with two traditional recommendation mechanisms was performed: a) using only users' past opinions about a resource and/or other users; and b) not using any reputation assessment and providing the recommendation considering directly the objective quality of the resources. The results show that the developed model improves on traditional approaches at providing recommendations in Technology Enhanced Learning (TEL) platforms, presenting a higher adaptability to different situations, whereas traditional approaches only have good results under favourable conditions. Furthermore, the promotion period mechanism implemented successfully helps new users in the system to be recommended for direct interactions, as well as the resources created by them. On the contrary, OnlyOpinions fails completely and new users are never recommended, while traditional approaches only work partially. Finally, the agent-oriented programming (AOP) paradigm has proven its validity at modelling users' behaviours in TEL platforms. Intelligent software agents' characteristics matched the main requirements of the simulation tool. The proactivity, sociability and adaptability of the developed agents allowed reproducing real users' actions and attitudes through the diverse situations defined in the evaluation framework. The result was independent users, accessing different resources and communicating amongst themselves to fulfil their needs, basing these interactions on the recommendations provided by the reputation engine.
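The combination of objective evaluations and subjective opinions described above can be sketched as a weighted blend. The weighting scheme, the fallback rule for cold-start users, and all numbers are our own illustrative assumptions, not the thesis's actual reputation model:

```python
def reputation(objective_score, opinions, w_obj=0.6):
    """Blend an objective quality score in [0, 1] with the mean of subjective
    opinions in [0, 1]. The objective component damps the effect of unfairly
    low ratings from malicious users (illustrative weighting, assumed)."""
    if not opinions:
        # Cold start: a new resource/user with no opinions yet still gets a
        # score, based on the objective evaluation alone.
        return objective_score
    subjective = sum(opinions) / len(opinions)
    return w_obj * objective_score + (1 - w_obj) * subjective

# A good resource (objective quality 0.9) attacked with unfair near-zero
# ratings keeps a usable score, unlike an opinions-only scheme.
attacked = reputation(0.9, [0.0, 0.0, 0.1])        # 0.6*0.9 + 0.4*(0.1/3)
opinions_only = sum([0.0, 0.0, 0.1]) / 3
print(attacked > opinions_only)                     # prints "True"
```

This also mirrors the cold-start behaviour the abstract highlights: with no opinions available, recommendation falls back on the objective evaluation instead of returning nothing.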
Abstract:
Single channel recordings demonstrate that ion channels switch stochastically between an open and a closed pore conformation. In search of a structural explanation for this universal open/close behavior, we have uncovered a striking degree of amino acid homology across the pore-forming regions of voltage-gated K+ channels and glutamate receptors. This suggested that the pores of these otherwise unrelated classes of channels could be structurally conserved. Strong experimental evidence supports a hairpin structure for the pore-forming region of K+ channels. Consequently, we hypothesized the existence of a similar structure for the pore of glutamate receptors. In ligand-gated channels, the pore is formed by M2, the second of four putative transmembrane segments. A hairpin structure for M2 would affect the subsequent membrane topology, inverting the proposed orientation of the next segment, M3. We have tested this idea for the NR1 subunit of the N-methyl-D-aspartate receptor. Mutations that affected the glycosylation pattern of the NR1 subunit localize both extremes of the M3-M4 linker to the extracellular space. Whole cell currents and apparent agonist affinities were not affected by these mutations. Therefore it can be assumed that they represent the native transmembrane topology. The extracellular assignment of the M3-M4 linker challenges the current topology model by inverting M3. Taken together, the amino acid homology and the new topology suggest that the pore-forming M2 segment of glutamate receptors does not traverse the membrane but, rather, forms a hairpin structure, similar to that found in K+ channels.
Abstract:
We evaluated the role of the larval parasitoid, Diadegma semiclausum Hellen (Hymenoptera: Ichneumonidae), in controlling Plutella xylostella (L.) (Lepidoptera: Plutellidae) by cage exclusion experiments and direct field observation during the winter season in southern Queensland, Australia. The cage exclusion experiment involved uncaged, open cage and closed cage treatments. A higher percentage (54-83%) of P. xylostella larvae on sentinel plants were lost in the uncaged treatment than in the closed (4-9%) or open cage treatments (11-29%). Of the larvae that remained in the uncaged treatment, 72-94% were parasitized by D. semiclausum, much higher than in the open cage treatment (8-37% in the first trial, and 38-63% in the second trial). Direct observations showed a significant aggregation response of the field D. semiclausum populations to high host density plants in an experimental plot and to high host density plots that were artificially set up near to the parasitoid source fields. The degree of aggregation varied in response to habitat quality of the parasitoid source field and the scale of the manipulated host patches. As a result, density-dependence in the pattern of parasitism may depend on the relative degree of aggregation of the parasitoid population at a particular scale. A high degree of aggregation seems to be necessary to generate density-dependent parasitism by D. semiclausum. Integration of the cage exclusion experiment and direct observation demonstrated the active and dominant role of this parasitoid in controlling P. xylostella in the winter season. A biologically based IPM strategy, which incorporates the use of D. semiclausum with Bt, is suggested for the management of P. xylostella in seasons or regions with mild temperatures.
Abstract:
The aim of this paper is to continue the study of θ-irresolute and quasi-irresolute functions as well as to give an example of a function which is θ-irresolute but neither quasi-irresolute nor an R-map and thus give an answer to a question posed by Ganster, Noiri and Reilly. We prove that RS-compactness is preserved under open, quasi-irresolute surjections.
Abstract:
The miniaturization, sophistication, proliferation, and accessibility of technologies are enabling the capture of more and previously inaccessible phenomena in Parkinson's disease (PD). However, more information has not translated into a greater understanding of disease complexity to satisfy diagnostic and therapeutic needs. Challenges include noncompatible technology platforms, the need for wide-scale and long-term deployment of sensor technology (among vulnerable elderly patients in particular), and the gap between the "big data" acquired with sensitive measurement technologies and their limited clinical application. Major opportunities could be realized if new technologies are developed as part of open-source and/or open-hardware platforms that enable multichannel data capture sensitive to the broad range of motor and nonmotor problems that characterize PD and are adaptable into self-adjusting, individualized treatment delivery systems. The International Parkinson and Movement Disorder Society Task Force on Technology is entrusted to convene engineers, clinicians, researchers, and patients to promote the development of integrated measurement and closed-loop therapeutic systems with high patient adherence that also serve to (1) encourage the adoption of clinico-pathophysiologic phenotyping and early detection of critical disease milestones, (2) enhance the tailoring of symptomatic therapy, (3) improve subgroup targeting of patients for future testing of disease-modifying treatments, and (4) identify objective biomarkers to improve the longitudinal tracking of impairments in clinical care and research. This article summarizes the work carried out by the task force toward identifying challenges and opportunities in the development of technologies with potential for improving the clinical management and the quality of life of individuals with PD. © 2016 International Parkinson and Movement Disorder Society.
Abstract:
Traditional classrooms have often been regarded as closed spaces within which experimentation, discussion and exploration of ideas occur. Professors are used to being able to express ideas frankly, and occasionally rashly, while discussions are ephemeral and conventional student work is submitted, graded and often shredded. However, digital tools have transformed the nature of privacy. As we move towards the creation of life-long archives of our personal learning, we collect material created in various 'classrooms'. Some of these are public and open, but others were created within 'circles of trust' with expectations of privacy and anonymity by learners. Taking the Creative Commons license as a starting point, this paper asks: what rights and expectations of privacy exist in learning environments? What methods might we use to define a 'privacy license' for learning? How should the privacy rights of learners be balanced with the need to encourage open learning and with the creation of eportfolios as evidence of learning? How might we define different learning spaces and the privacy rights associated with them? Which class activities are 'private' and closed to the class, which are open, and what lies between? A limited set of metrics or zones is proposed, along the axes of private-public, anonymous-attributable and non-commercial-commercial, to define learning spaces and the digital footprints created within them. The application of these not only to the artefacts which reflect learning, but to the learning spaces, and indeed to digital media more broadly, is explored. The possibility that these might inform not only teaching practice but also grading rubrics in disciplines where public engagement is required will also be explored, along with the need for consideration by educational institutions of the data rights of students.
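The three proposed axes lend themselves to a simple classification structure. The following is a minimal sketch, assuming discrete values on each axis (the value names and the `PrivacyZone` type are illustrative choices, not taken from the paper):

```python
from dataclasses import dataclass
from enum import Enum

# Hypothetical encoding of the paper's three axes; the intermediate
# values ("semi-public", "pseudonymous") stand in for "what lies between".

class Visibility(Enum):
    PRIVATE = "private"
    SEMI_PUBLIC = "semi-public"
    PUBLIC = "public"

class Attribution(Enum):
    ANONYMOUS = "anonymous"
    PSEUDONYMOUS = "pseudonymous"
    ATTRIBUTABLE = "attributable"

class Use(Enum):
    NON_COMMERCIAL = "non-commercial"
    COMMERCIAL = "commercial"

@dataclass(frozen=True)
class PrivacyZone:
    """One point in the private-public / anonymous-attributable /
    non-commercial-commercial space that a learning artefact occupies."""
    visibility: Visibility
    attribution: Attribution
    use: Use

    def label(self) -> str:
        # Compact license-style label for the zone
        return f"{self.visibility.value}/{self.attribution.value}/{self.use.value}"

# An in-class discussion held within a 'circle of trust':
classroom_discussion = PrivacyZone(
    Visibility.PRIVATE, Attribution.ANONYMOUS, Use.NON_COMMERCIAL
)
# A public eportfolio artefact submitted as evidence of learning:
eportfolio = PrivacyZone(
    Visibility.PUBLIC, Attribution.ATTRIBUTABLE, Use.NON_COMMERCIAL
)
print(classroom_discussion.label())
print(eportfolio.label())
```

A scheme like this could back both a machine-readable 'privacy license' attached to each artefact and the kind of grading rubric the paper envisages for publicly engaged coursework.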