992 results for Pre-emergence
Abstract:
Study/Objective: This study examines the current state of disaster response education for Australian paramedics from a national and international perspective, and identifies both potential gaps in content and challenges to the sustainability of knowledge acquired through occasional training. Background: As demands for domestic and international disaster response increase, field experience has begun to challenge the traditional assumption that response to mass casualty events requires little specialist training. The need for a “streamlined process of safe medical team deployment into disaster regions”1 is generally accepted and, in Australia, the emergence of national humanitarian aid training has begun to address this gap. However, calls for a national framework for disaster health education2 have gained little traction. Methods: A critical analysis of the peer-reviewed and grey literature on the core components/competencies and training methods required to prepare Australian paramedics to contribute to effective health disaster response was conducted. Research from the past 10 years was examined, along with federal and state policy on paramedic disaster education. Results: The literature shows that education and training for disaster response is variable, and that an evidence-based study specifically designed to outline sets of core competencies for Australian health care professionals has never been undertaken. While such competencies in disaster response have been developed for the American paradigm, disaster response within the Australian context differs from that of the US, and therefore a gap in the current knowledge base exists. Conclusion: Further research is needed to develop core competencies specific to Australian paramedics in order to standardise teaching in the area of health disaster management.
Until this occurs, the task of evaluating or creating disaster curricula that adequately prepare paramedics for, and maintain their readiness for, an effective all-hazards disaster response is seen as largely unattainable.
Abstract:
The objectives of this study were to determine the impact of different instructional constraints on standing broad jump (SBJ) performance in children and to understand the underlying changes in emergent movement patterns. Two groups of novice participants were provided with either externally or internally focused attentional instructions during an intervention phase. Pre- and post-test sessions were undertaken to determine changes in performance and movement patterns. Thirty-six primary fourth-grade male students were recruited for this study and randomly assigned to an external focus, internal focus, or control group. Different instructional constraints with either an external focus (image of the achievement) or an internal focus (image of the act) were provided to the participants. Performance scores (jump distances) and data on key kinematic (joint range of motion, ROM) and kinetic (jump impulse) variables were collected. Instructional constraints with an emphasis on an external focus of attention were generally more effective in helping learners improve jump distances. Intra-individual analyses highlighted how enhanced jump distances for successful participants may be concomitant with specific changes in kinematic and kinetic variables. Larger joint ROM and an adjustment towards a comparatively larger horizontal impulse relative to vertical impulse were observed for more successful participants at post-test. From a constraints-led perspective, the inclusion of instructional constraints encouraging self-adjustments in the control of movements (i.e., image of the achievement) had a beneficial effect on individuals performing the standing broad jump task. However, the advantage of using an external focus of attention in instructions could be task- and individual-specific.
Abstract:
Drought during the pre-flowering stage can increase the yield of peanut. There is limited information on genotypic variation in tolerance to, and recovery from, pre-flowering drought (PFD), and more importantly on the physiological traits underlying that variation. The objectives of this study were to determine the effects of moisture stress during the pre-flowering phase on pod yield, and to understand some of the physiological responses underlying genotypic variation in response to, and recovery from, PFD. Glasshouse and field experiments were conducted at Khon Kaen University, Thailand. The glasshouse experiment was a randomized complete block design consisting of two watering regimes, i.e. a fully-irrigated control and 1/3 available soil water from emergence to 40 days after emergence followed by an adequate water supply, and 12 peanut genotypes. The field experiment was a split-plot design with the two watering regimes as main plots and the 12 peanut genotypes as sub-plots. Measurements of N2 fixation and leaf area (LA) were made in both experiments. In addition, root growth was measured in the glasshouse experiment. Imposition of PFD followed by recovery resulted in an average yield increase of 24% (range 10% to 57%) in the field experiment and 12% (range 2% to 51%) in the glasshouse experiment. Significant genotypic variation in N2 fixation, LA and root growth was also observed after recovery. The study revealed that recovery growth following release of PFD had a stronger influence on final yield than tolerance to water deficits during the PFD. A combination of N2 fixation, LA and root growth accounted for a major portion of the genotypic variation in yield (r = 0.68-0.93), suggesting that these traits could be used as selection criteria for identifying genotypes with rapid recovery from PFD.
A combined analysis of the glasshouse and field experiments showed that LA and N2 fixation during recovery had a low genotype × environment interaction, indicating the potential of these traits for selecting genotypes in peanut improvement programs.
Abstract:
We present the results of the microstratigraphic, phytolith and wood charcoal study of the remains of a 10.5 ka roof. The roof is part of a building excavated at Tell Qarassa (southern Syria), assigned to the Pre-Pottery Neolithic B (PPNB) period. The Pre-Pottery Neolithic (PPN) period in the Levant coincides with the emergence of farming. This fundamental change in subsistence strategy implied a shift from mobile to settled, aggregated life, and from tents and huts to durable buildings. As settled life spread across the Levant, a generalised transition from round to square buildings occurred, which is a hallmark of the PPNB period. The study of these buildings is fundamental for understanding the ever-stronger reciprocal socio-ecological relationship humans have developed with the local environment since the introduction of sedentism and domestication. Descriptions of buildings in PPN archaeological contexts are usually restricted to macroscopic observation of wooden elements (posts and beams) and mineral components (daub, plaster and stone elements). Reconstructions of microscopic and organic components are frequently based on ethnographic analogy. The direct study of macroscopic and microscopic, organic and mineral building components performed at Tell Qarassa provides new insights into building conception, maintenance, use and destruction. These elements reflect newly emerging paradigms in the relationship between Neolithic societies and the environment. A square building here was possibly covered with a radial roof, offering a glimpse of a topological shift in the conception and understanding of volumes, from round-based to square-based geometries. Macroscopic and microscopic roof components indicate that buildings were conceived for year-round residence rather than seasonal mobility. This implied performing maintenance and restoration of partially damaged buildings, as well as their adaptation to seasonal variability.
Abstract:
Increasing interactions between humans and wild animals at the edges of natural habitats could facilitate pathogen transmission between humans and the various animal species of an ecosystem, and thus favour the emergence of disease. We conducted a cross-sectional study of Giardia and Cryptosporidium infection in humans, domestic animals, rodents and lemurs within the Ranomafana ecosystem, Madagascar. Fecal samples were collected non-invasively from volunteer participants, domestic mammals and introduced rodents living in three villages bordering Ranomafana National Park (RNP), as well as from four lemur species (Propithecus edwardsii, Prolemur simus, Eulemur rubriventer and Microcebus rufus) within the RNP. Coproscopic analyses using the direct immunofluorescence technique were performed to detect the presence of Cryptosporidium and Giardia. Their prevalence was estimated and variables associated with parasite infection were identified. Cryptosporidium and Giardia were detected in humans with estimated prevalences of 22.9% and 13.6%, respectively. The prevalence of these two parasites ranged from 0% to 60% among domestic animals and rodents within the villages. Host species, age and co-infection with another protozoan were the only variables associated with Cryptosporidium and Giardia infection in this ecosystem, while no association with co-infection by any nematode order was detected. In addition, Cryptosporidium was detected in 10.5% of the lemurs of the RNP. This study documents for the first time the presence of Cryptosporidium in two lemur species of the RNP. In contrast, Giardia was not detected in the lemur samples from the RNP.
Abstract:
Recent excavations at Pre-Pottery Neolithic A (PPNA) WF16 in southern Jordan have revealed remarkable evidence of architectural developments in the early Neolithic. This sheds light on both special purpose structures and “domestic” settlement, allowing fresh insights into the development of increasingly sedentary communities and the social systems they supported. The development of sedentary communities is a central part of the Neolithic process in Southwest Asia. Architecture and ideas of homes and households have been important to the debate, although there has also been considerable discussion on the role of communal buildings and the organization of early sedentarizing communities since the discovery of the tower at Jericho. Recently, the focus has been on either northern Levantine PPNA sites, such as Jerf el Ahmar, or the emergence of ritual buildings in the Pre-Pottery Neolithic B of the southern Levant. Much of the debate revolves around a division between what is interpreted as domestic space, contrasted with “special purpose” buildings. Our recent evidence allows a fresh examination of the nature of early Neolithic communities.
Abstract:
Understanding how the emergence of the anthropogenic warming signal from the noise of internal variability translates into changes in extreme event occurrence is of crucial societal importance. Utilising simulations of cumulative carbon dioxide (CO2) emissions and temperature changes from eleven Earth system models, we demonstrate that the inherently lower internal variability found at tropical latitudes results in large increases in the frequency of extreme daily temperatures (exceedances of the 99.9th percentile derived from pre-industrial climate simulations) occurring much earlier than for mid-to-high-latitude regions. Most of the world's poorest people, when considering 2010 GDP-PPP per capita, live at low latitudes; conversely, the wealthiest population quintile disproportionately inhabits more variable mid-latitude climates. Consequently, the fraction of the global population in the lowest socio-economic quintile is exposed to substantially more frequent daily temperature extremes after much smaller increases in both mean global warming and cumulative CO2 emissions.
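The mechanism this abstract describes, namely that low internal variability makes a fixed warming shift cross a pre-industrial extreme threshold far more often, can be illustrated with a toy calculation. The Gaussian daily series and the 0.8-degree shift below are synthetic assumptions for illustration, not values from the study:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 36500  # roughly 100 years of daily values

def exceedance_amplification(sigma, shift=0.8):
    """Factor by which a uniform warming `shift` multiplies the
    frequency of days above the pre-industrial 99.9th percentile,
    for a climate whose daily standard deviation is `sigma`."""
    baseline = rng.normal(25.0, sigma, size=N)      # pre-industrial days
    threshold = np.percentile(baseline, 99.9)       # extreme threshold
    f_base = np.mean(baseline > threshold)          # ~0.001 by construction
    f_warm = np.mean(baseline + shift > threshold)  # same days, warmed
    return f_warm / f_base

amp_tropics = exceedance_amplification(sigma=1.0)  # low-variability climate
amp_midlat = exceedance_amplification(sigma=4.0)   # high-variability climate
print(amp_tropics, amp_midlat)
```

With identical warming, the low-variability series crosses its own extreme threshold far more often than the high-variability one, which is the asymmetry the study maps onto latitude and income.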
Abstract:
This paper discusses the emergence of Shiʿite mourning rituals around the grave of Husayn b. ʿAli. After the killing of Husayn at Karbala’ in 61/680, a number of men in Kufa felt deep regret at their failure to come to the aid of the grandson of the Prophet. They gathered and discussed how they could best do penitence for this crime. Eventually, they decided to take up arms and march against the Umayyad army, to kill those who killed Husayn or be killed themselves in the attempt to avenge him. Thus, they are called the Penitents (Ar. Tawwābūn). On their way to the battlefield they stopped at Husayn’s tomb at Karbala’, dedicating themselves to remorseful prayer, crying and wailing over the fate of Husayn and their own sin. When the Penitents performed certain ritual acts, such as weeping and wailing over the death of Husayn, visiting his grave, asking for God’s mercy upon him on the Day of Judgment, and demanding blood revenge for him, they entered into rituals that already existed in the pre-Islamic Arab and early Muslim context, that is, rituals traditionally performed at the death of a person. What is new is that the rituals the Penitents performed had partially received new content. As described, the rituals were performed out of loyalty towards Husayn and the family of the Prophet. The lack of loyalty in connection with the death of Husayn was conceived of as a sin that had to be atoned. Blood revenge thus became not only a pure act of revenge to restore honor, but equally an expression of true religious conversion and penitence. Humphrey and Laidlaw argue that ritual actions in themselves are not bearers of meaning, but that they are filled with meaning by the performer. According to them, ritual actions are apprehensible, i.e. they can be, and should be, filled with meaning, and the people who perform them try to do so within the context where the ritual is performed.
The story of the Penitents is a clear example of mourning rituals as actions that survive from earlier times but are filled with new meaning when performed in a new and developing movement with a different ideology. In later Shiʿism, these rituals were elaborated and became a main tenet of this form of Islam.
Abstract:
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)
Abstract:
Background and Objectives - Sevoflurane is an inhalational anesthetic with low blood/gas solubility, providing fast anesthesia induction and emergence. Its ability to maintain cardiovascular stability makes it ideal for pediatric anesthesia. The aim of this study was to evaluate hemodynamic stability, inhalational anesthetic consumption and emergence time in children with and without premedication (midazolam or clonidine) anesthetized with sevoflurane titrated according to BIS monitoring. Methods - Thirty patients aged 2 to 12 years, physical status ASA I, undergoing elective surgeries participated in this study and were divided into 3 groups: G1 - no premedication; G2 - 0.5 mg.kg-1 oral midazolam; G3 - 4 μg.kg-1 oral clonidine, 60 minutes before surgery. All patients received 30 μg.kg-1 alfentanil, 3 mg.kg-1 propofol, 0.5 mg.kg-1 atracurium, sevoflurane in different concentrations monitored by BIS (values close to 60), and N2O in a non-rebreathing system. Systolic and diastolic blood pressure, heart rate, expired sevoflurane concentration (EC), sevoflurane consumption (ml.min-1) and emergence time were evaluated. Emergence time was defined as the time elapsed between the end of anesthesia and the patient's spontaneous movements attempting self-extubation, crying, and opening of eyes and mouth. Results - There were no differences among groups in systolic and diastolic blood pressure, EC, sevoflurane consumption or emergence time. Heart rate was lower in the G3 group. Conclusions - Sevoflurane provided hemodynamic stability. Premedication with clonidine or midazolam did not influence emergence time, inhaled anesthetic consumption or maintenance of anesthesia with sevoflurane. Anesthesia duration also did not influence emergence time. Hypnosis monitoring was important for balancing anesthetic levels, and this may have been responsible for the similarity of emergence times across the studied groups.
Abstract:
This is the first record of Acanthoscelides schrankiae Horn feeding in seeds of Mimosa bimucronata (DC.) Kuntze. We investigated the pattern of oviposition and seed exploitation by A. schrankiae, and the distribution of mature fruits and seed predation in the inflorescences. We also compared the percentage of predated seeds, the total dry weight of fruits and non-predated seeds, the percentage of aborted seeds, and the percentage of non-emergent insects among different quadrants of the M. bimucronata canopy. To determine the species present, the emergence of bruchids and parasitoids was observed in the laboratory, yielding only individuals of A. schrankiae and of the parasitoid Horismenus sp. (Hymenoptera: Eulophidae), respectively. The mean number of fruits produced in the median region of the inflorescence was significantly higher than in the inferior and superior regions, and the observed and expected frequencies of predated and non-predated seeds differed among the regions of the inflorescence. Females of A. schrankiae laid their eggs on fruits, and larvae, after emergence, perforated the exocarp to reach the seeds. Most fruits carried one to three eggs, and only one bruchid larva was observed in each seed. The highest ratio of eggs per fruit and the highest percentage of predated seeds were recorded in April. The dry weight of fruits (total) and seeds (non-predated), and the proportions of predated seeds, aborted seeds and non-emergent seed predators, were evenly distributed in the canopy.
Abstract:
BACKGROUND Drug resistance is a major barrier to successful antiretroviral treatment (ART). Therefore, it is important to monitor time trends at a population level. METHODS We included 11,084 ART-experienced patients from the Swiss HIV Cohort Study (SHCS) between 1999 and 2013. The SHCS is highly representative and includes 72% of patients receiving ART in Switzerland. Drug resistance was defined as the presence of at least one major mutation in a genotypic resistance test. To estimate the prevalence of drug resistance, data for patients with no resistance test were imputed based on each patient's risk of harboring drug-resistant viruses. RESULTS The emergence of new drug resistance mutations declined dramatically, from 401 patients in 1999 to 23 in 2013. The upper estimated prevalence limit of drug resistance among ART-experienced patients decreased from 57.0% in 1999 to 37.1% in 2013. The prevalence of three-class resistance decreased from 9.0% to 4.4%, and was always <0.4% for patients who initiated ART after 2006. Most patients actively participating in the SHCS in 2013 with drug-resistant viruses had initiated ART before 1999 (59.8%). Nevertheless, in 2013, 94.5% of patients who initiated ART before 1999 still had good remaining treatment options based on the Stanford algorithm. CONCLUSION HIV-1 drug resistance among ART-experienced patients in Switzerland is a well-controlled relic from the pre-combination ART era. The emergence of drug resistance can be virtually stopped with new potent therapies and close monitoring.
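The imputation step can be pictured with a minimal sketch. The cohort and risk scores below are invented for illustration, and the upper limit shown is only one plausible construction; the actual SHCS analysis is more elaborate. The idea is to combine observed resistance with model-predicted risk for untested patients:

```python
import numpy as np

# Hypothetical cohort (invented numbers): status of tested patients
# (1 = major resistance mutation found, 0 = none) and model-predicted
# risk of resistance for patients who were never tested.
tested = np.array([1, 0, 0, 1, 0, 0, 0, 1, 0, 0], dtype=float)
untested_risk = np.array([0.6, 0.1, 0.3])

n = len(tested) + len(untested_risk)

# Point estimate: each untested patient contributes their predicted risk.
prev_imputed = (tested.sum() + untested_risk.sum()) / n

# One possible upper limit: count every untested patient as resistant.
prev_upper = (tested.sum() + len(untested_risk)) / n

print(prev_imputed, prev_upper)  # 4/13 and 6/13 for these numbers
```

Reporting the upper limit alongside the point estimate makes the prevalence trend robust to untested patients, which matters in a cohort where testing is selective.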
Abstract:
The Ink4a/Arf locus encodes p16Ink4a and p19Arf and is among the most frequently mutated tumor suppressor loci in human cancer. In mice, many of these effects appear to be mediated by interactions between p19Arf and the p53 tumor-suppressor protein. Because Tp53 mutations are a common feature of the multistep pre-B cell transformation process mediated by Abelson murine leukemia virus (Ab-MLV), we examined the possibility that proteins encoded by the Ink4a/Arf locus also play a role in Abelson virus transformation. Analyses of primary transformants revealed that both p16Ink4a and p19Arf are expressed in many of the cells as they emerge from the apoptotic crisis that characterizes the transformation process. Analyses of primary transformants from Ink4a/Arf null mice revealed that these cells bypassed crisis. Because expression of p19Arf but not p16Ink4a induced apoptosis in Ab-MLV-transformed pre-B cells, p19Arf appears to be responsible for these events. Consistent with the link between p19Arf and p53, Ink4a/Arf expression correlates with or precedes the emergence of cells expressing mutant p53. These data demonstrate that p19Arf is an important part of the cellular defense mounted against transforming signals from the Abl oncoprotein and provide direct evidence that the p19Arf–p53 regulatory loop plays an important role in lymphoma induction.
Abstract:
The emergence of modern humans in the Late Pleistocene, whatever its phylogenetic history, was characterized by a series of behaviorally important shifts reflected in aspects of human hard tissue biology and the archeological record. To elucidate these shifts further, diaphyseal cross-sectional morphology was analyzed by using cross-sectional areas and second moments of area of the mid-distal humerus and midshaft femur. The humeral diaphysis indicates a gradual reduction in habitual load levels from Eurasian late archaic, to Early Upper Paleolithic early modern, to Middle Upper Paleolithic early modern hominids, with the Levantine Middle Paleolithic early modern humans being a gracile anomalous outlier. The femoral diaphysis, once variation in ecogeographically patterned body proportions is taken into account, indicates no changes across the pre-30,000 years B.P. samples in habitual locomotor load levels, followed by a modest decrease through the Middle Upper Paleolithic.
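The quantities this abstract relies on have simple closed forms for idealised sections. A minimal sketch, assuming a hollow circular (annular) model of a diaphysis with made-up radii; real analyses of this kind use the true cross-sectional geometry:

```python
import math

def cortical_area(r_outer, r_inner):
    """Cortical cross-sectional area of an annular section (mm^2),
    a proxy for resistance to axial loading."""
    return math.pi * (r_outer**2 - r_inner**2)

def second_moment(r_outer, r_inner):
    """Second moment of area of an annular section (mm^4), a proxy for
    resistance to bending: I = pi/4 * (r_outer^4 - r_inner^4)
    about any diametral axis."""
    return math.pi / 4.0 * (r_outer**4 - r_inner**4)

# Hypothetical mid-distal humerus: 10 mm periosteal, 6 mm medullary radius.
A = cortical_area(10.0, 6.0)   # 64*pi, about 201 mm^2
I = second_moment(10.0, 6.0)   # 2176*pi, about 6836 mm^4
```

Because I grows with the fourth power of the radii, modest remodelling of the cortex under habitual load produces large changes in bending resistance, which is why diaphyseal robusticity is read as a record of load levels.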