Abstract:
A semi-selective agar medium was developed for the detection of Xanthomonas axonopodis pv. malvacearum (Xam) in cotton (Gossypium hirsutum) seed. The basic medium was peptone-sucrose-agar (PSA). The criteria for the semi-selective medium were the typical colony characters of Xam and its pathogenicity on cotton. Several systemic fungicides and antibiotics at different concentrations were tested alone or in combination. The final composition of the semi-selective agar medium was established after several attempts to inhibit most of the fungal and bacterial saprophytes while favouring the development of Xam. It contained PSA + cycloheximide, cephalexin, pencycuron, triadimenol and tolylfluanid. The bacteria were recovered from naturally infected seeds by direct plating of 2,000 surface-disinfected seeds on the semi-selective medium. The recovery of the pathogen from naturally infected leaf tissues and in dilution plating was comparable on the semi-selective medium and on nutrient agar. Among the three detection methods tested, the semi-selective medium was found to be the most reliable and quantifiable. The degree of severity of angular leaf spot in the field was not always correlated with the level of infection in the seed. This is the first report of a semi-selective agar medium for detecting Xam in naturally infected cotton seed.
Abstract:
Netnography has been studied from various angles (e.g. definitions of netnography, applications of netnography, conducting procedures) within different industrial contexts. Likewise, there are many studies of new product development from various perspectives, such as new product development models, management of new product development projects, or the interaction between customers and new product design. However, the connection and interaction between netnography and new product development have not been studied recently. This opens opportunities for the writer to explore unrevealed issues regarding the application of netnography in new product development. Concerning the relation between netnography and new product development, numerous matters need to be explored: for instance, the process of applying netnography so that it benefits new product development, the degree of involvement of netnography in the new product development process, or the elimination of useless information from netnography so that only crucial data is utilized. In this thesis, the writer focuses on exploring how netnography is applied in the new product development process and what benefits netnography can contribute to the success of a project. The aims of this study are to understand how netnography is conducted for new product development purposes and to analyse the contributions of netnography to the new product development process. To do so, a case-study strategy is applied to three cases. The cases were chosen on the basis of several criteria in order to select the most relevant ones. Eventually, the writer selected three case studies: the sunless tanning product project (HYVE), Listerine (NetBase), and Nivea's co-creation and netnography in the black and white deodorant. The case-study strategy applied in this thesis includes four steps: case selection, data collection, case-study analysis, and generating the research outcomes from the analysis. This study of the contributions of netnography to the new product development process may be useful to readers in many ways. It offers fundamental knowledge of the netnographic market research method and a basic understanding of the new product development process. Additionally, it emphasizes the differences between netnography and other market research methods in order to explain why many companies and market research agencies have recently utilized netnography in their market research projects. Furthermore, it highlights the contributions of netnography to the new product development process in order to indicate the importance of netnography in developing new products. Thus, the potential readers of the study include students, marketers, researchers, product developers, and business managers.
Abstract:
Cerradão vegetation shares many species with savanna and forest areas and is one of the most vulnerable phytophysiognomies in the Cerrado (Brazilian savanna) biome. The floristic composition of the Cerradão Biological Reserve was examined between September 2007 and November 2008. A total of 282 species distributed among 194 genera and 75 families were encountered, corresponding to proportions of 0.91 herbaceous species and 0.54 shrub species for each tree species. Fabaceae, Asteraceae, Rubiaceae, Poaceae, Myrtaceae, Malpighiaceae, and Melastomataceae were the most species-rich families. Fully 72.3% of the species of this dystrophic cerradão were shared with cerrado and forest vegetation, while 60.43% were shared with other cerradão sites, although the largest proportion of species (91%) was shared with cerrado sensu stricto. No species was found to be exclusive to this cerradão site, but approximately 95% of all species were native to the Cerrado biome.
Abstract:
Today's healthcare organizations are under constant pressure to change, as hospitals should be able to offer their patients the best possible medical care with limited resources while, at the same time, retaining a steady efficiency level in their operations. This is challenging, especially in trauma hospitals, where the variation in patient cases and volumes is relatively high. Furthermore, trauma patient care requires plenty of resources, as most of the patients have to be treated as single cases. Occasionally, sudden increases in demand cause congestion in the operations of the hospital, which in Töölö hospital appears as an increase in surgery waiting times for yellow urgency class patients. An increase in surgery waiting times may cause a deterioration of the patient's condition, which also raises the surgical risks. The congestion itself overloads the hospital's capacity and staff. The aim of this master's thesis is to introduce the factors contributing to the trauma process and to examine the correlation between the different variables and lengthened surgery waiting times. The results of this study are based on three years of patient data and several quantitative analyses. Based on the analyses, a daily usable indicator was created to support decision making in operations management. By using the selected indicator, the effects of congestion can be recognized and corrective action can be taken more proactively.
Abstract:
Human serum albumin (HSA) is the most abundant protein in the intravascular compartment. It possesses a single thiol, Cys34, which constitutes ~80% of the total thiols in plasma. This thiol is able to scavenge plasma oxidants. A central intermediate in this potential antioxidant activity of human serum albumin is sulfenic acid (HSA-SOH). Work from our laboratories has demonstrated the formation of a relatively stable sulfenic acid in albumin through complementary spectrophotometric and mass spectrometric approaches. Recently, we have been able to obtain quantitative data that allowed us to measure the rate constants of sulfenic acid reactions with molecules of analytical and biological interest. Kinetic considerations led us to conclude that the most likely fate for sulfenic acid formed in the plasma environment is the reaction with low molecular weight thiols to form mixed disulfides, a reversible modification that is actually observed in ~25% of circulating albumin. Another possible fate for sulfenic acid is further oxidation to sulfinic and sulfonic acids. These irreversible modifications are also detected in the circulation. Oxidized forms of albumin are increased in different pathophysiological conditions and sulfenic acid lies in a mechanistic junction, relating oxidizing species to final thiol oxidation products.
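For concreteness, the chemistry summarized above can be written as a minimal reaction scheme (our notation, not the authors'; RSH denotes a low molecular weight thiol such as cysteine, and hydrogen peroxide stands in for a generic plasma oxidant):

\[
\text{HSA-SH} + \text{H}_2\text{O}_2 \rightarrow \text{HSA-SOH} + \text{H}_2\text{O}
\]
\[
\text{HSA-SOH} + \text{RSH} \rightarrow \text{HSA-SSR} + \text{H}_2\text{O} \quad (\text{reversible mixed disulfide, seen in } {\sim}25\% \text{ of circulating albumin})
\]
\[
\text{HSA-SOH} \xrightarrow{[\text{O}]} \text{HSA-SO}_2\text{H} \xrightarrow{[\text{O}]} \text{HSA-SO}_3\text{H} \quad (\text{irreversible sulfinic and sulfonic acids})
\]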
Abstract:
Within the framework of state security policy, the focus of this dissertation is the relation between how new security threats are perceived and the policy planning and bureaucratic implementation designed to address them. In addition, this thesis explores some of the inertias that may exist in the core of the state apparatus as it addresses new threats, and how these could be better managed. The dissertation is built on five thematic and interrelated articles highlighting different aspects of the span from when significant new national security threats are detected by governments until those threats translate, on the policy planning side, into protective measures within society. The timeline differs widely between countries, and some key aspects of this process are also studied. One focus concerns mechanisms for adaptability within the intelligence community, another the policy planning process within Cabinet Offices/National Security Councils, and the third the planning process and how policy is implemented within the bureaucracy. The issue of policy transfer is also analysed, revealing that there is some imitation of innovation within governmental structures and policies, for example in the field of cyber defence. The main finding of the dissertation is that this context has built-in inertias and bureaucratic seams found in most government bureaucratic machineries. Because much of the information and planning involved is subject to security classification, transparency and internal debate on these issues are restricted, and alternative assessments become limited. To remedy this situation, the thesis recommends ways to improve the decision-making system in order to streamline the processes involved. Another special focus of the thesis concerns the role of public policy think tanks in the United States as an instrument of change in the country's national security decision-making environment, viewed as a possible source of new ideas and innovation. The findings in this part are based on unique interview data on how think tanks become successful and influence the policy debate in a country such as the United States. It appears clearly that in countries such as the United States think tanks smooth the decision-making processes, and that this model, with some adaptations, might also be transferable to other democratic countries.
Abstract:
The removal of organics from copper electrolyte solutions after solvent extraction by dual media filtration is one of the most efficient ways to ensure a clean electrolyte flow into electrowinning. Clean electrolyte ensures the production of good-quality cathode plates. Dual media filtration uses two layers of filter media, anthracite and garnet. The anthracite layer helps the entrained organic droplets coalesce; they then float to the top of the filter and return to the solvent extraction process. The garnet layer catches any solids left in the electrolyte traveling through the filter media. This thesis concentrates on the characterization of five different anthracites in order to find differences between them using specific surface area analysis, particle size analysis, and morphology analysis. These results are compared to the pressure loss values obtained from lab column tests and to the bed expansion behavior. The goal of the thesis was to find out whether there were any differences in the anthracites that would make one perform better than another. No major differences were found in any aspect of the particle characterization, but the differences that were found should be studied further in order to confirm the significance of porosity, surface area, intensity mean, and intensity SD (standard deviation) for anthracites and their use in dual media filtration. The thesis analyzed anthracite samples in a way not found in any public literature source, and further studies on the subject would bring more knowledge to the electrolyte process.
Abstract:
In this article, the biography and work of one of the most influential scientists in the history of psychology are briefly introduced. With his work, he laid the foundations for the scientific study not only of personality but also of human behaviour. Hence, the most important contributions made by this author are highlighted across a wide range of areas of our discipline, along with his vision of how psychology should operate as a science. A series of considerations related to the current situation of scientific psychology in Argentina leads us to conclude that it is essential to rescue his work from oblivion and to revisit some of his lines of research and thought.
Abstract:
We design a financial network model that explicitly incorporates linkages across institutions through a direct contagion channel as well as an indirect common exposure channel. In particular, the common exposure channel is set up so as to link the financial sector to the real sector. The model is calibrated to balance sheet data on the Colombian financial sector. Results indicate that commercial banks are the most systemically important financial institutions in the system, whereas government-owned institutions are the most vulnerable institutions in the system.
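As an illustration of how a direct contagion channel of this kind is typically simulated, the sketch below implements a minimal Furfine-style default cascade on an interbank exposure matrix. It is a generic textbook mechanism under assumed toy numbers, not the paper's calibrated model; all figures in it are hypothetical.

import java.util.Arrays;

// Minimal Furfine-style direct-contagion cascade (a generic illustration,
// not the paper's calibrated model). exposures[i][j] is what bank i is owed
// by bank j; bank i defaults once its losses from defaulted counterparties
// exceed its capital buffer.
public class ContagionSketch {

    static boolean[] cascade(double[][] exposures, double[] capital,
                             int initialDefault, double lossGivenDefault) {
        int n = capital.length;
        boolean[] defaulted = new boolean[n];
        defaulted[initialDefault] = true;
        boolean changed = true;
        while (changed) {                      // iterate until no new defaults
            changed = false;
            for (int i = 0; i < n; i++) {
                if (defaulted[i]) continue;
                double loss = 0.0;
                for (int j = 0; j < n; j++) {
                    if (defaulted[j]) loss += lossGivenDefault * exposures[i][j];
                }
                if (loss > capital[i]) {       // capital wiped out -> default
                    defaulted[i] = true;
                    changed = true;
                }
            }
        }
        return defaulted;
    }

    public static void main(String[] args) {
        // Hypothetical 3-bank system: bank 0's failure wipes out bank 1,
        // whose failure in turn brings down bank 2 (a second-round effect).
        double[][] exposures = { {0, 0, 0}, {5, 0, 0}, {0, 4, 0} };
        double[] capital = { 2, 3, 3 };
        System.out.println(Arrays.toString(cascade(exposures, capital, 0, 1.0)));
        // prints [true, true, true]
    }
}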
Abstract:
MPJ Express is our implementation of MPI-like bindings for Java. In this paper we discuss our intermediate buffering layer, which makes use of the so-called direct byte buffers introduced in the Java New I/O package. The purpose of this layer is to support the implementation of derived datatypes. MPJ Express is the first Java messaging library that implements this feature using pure Java. In addition, this buffering layer allows efficient implementation of communication devices based on proprietary networks such as Myrinet. In this paper we evaluate the performance of our buffering layer and demonstrate the usefulness of direct byte buffers. We also evaluate the performance of MPJ Express against other messaging systems using Myrinet and show that our buffering layer makes it possible to avoid the overheads suffered by other Java systems, such as mpiJava, that rely on the Java Native Interface.
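For readers unfamiliar with the Java New I/O facility the paper builds on, the following minimal sketch shows how a direct byte buffer is allocated outside the garbage-collected heap and used to pack and unpack a primitive array, the kind of marshalling a buffering layer performs for derived datatypes. This is plain JDK API, not MPJ Express's internal buffering interface.

import java.nio.ByteBuffer;
import java.nio.DoubleBuffer;
import java.util.Arrays;

// Plain JDK illustration of the "direct" byte buffers from the Java New I/O
// package. A direct buffer lives outside the garbage-collected heap, so a
// native communication layer (e.g. a Myrinet driver reached via JNI) can
// access it without the extra copy a heap-allocated array would require.
public class DirectBufferDemo {
    public static void main(String[] args) {
        double[] message = { 1.0, 2.0, 3.0 };

        // Pack the array into off-heap storage, as a buffering layer would
        // do when marshalling a derived datatype for sending.
        ByteBuffer buf = ByteBuffer.allocateDirect(message.length * Double.BYTES);
        DoubleBuffer view = buf.asDoubleBuffer();
        view.put(message);

        // Unpack on the receiving side.
        view.flip();
        double[] received = new double[message.length];
        view.get(received);
        System.out.println(Arrays.toString(received)); // [1.0, 2.0, 3.0]
    }
}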
Abstract:
The synoptic evolution and some meteorological impacts of the European winter storm Kyrill, which swept across Western, Central, and Eastern Europe between 17 and 19 January 2007, are investigated. The intensity and the large storm damage associated with Kyrill are explained on the basis of synoptic and mesoscale environmental storm features, as well as comparisons with previous storms. Kyrill appeared on weather maps over the US state of Arkansas about four days before it hit Europe. It underwent an explosive intensification over the western North Atlantic Ocean while crossing a very intense zonal polar jet stream. A superposition of several favourable meteorological conditions west of the British Isles caused a further deepening of the storm when it started to affect Western Europe. Evidence is provided that a favourable alignment of three polar jet streaks and a dry air intrusion over the occlusion and cold fronts were causal factors in maintaining Kyrill's low pressure very far into Eastern Europe. Kyrill, like many other strong European winter storms, was embedded in a pre-existing, anomalously wide, north-south mean sea-level pressure (MSLP) gradient field. In addition to the range of gusts that might be expected from the synoptic-scale pressure field, mesoscale features associated with convective overturning at the cold front are suggested as the likely causes of the extremely damaging peak gusts observed at many lowland stations during the passage of Kyrill's cold front. Compared to other storms, Kyrill was far from the most intense system in terms of core pressure and circulation anomaly. However, the system moved into a pre-existing strong MSLP gradient located over Central Europe which extended into Eastern Europe, and this fact is considered decisive for the anomalously large area affected by Kyrill. Additionally, considerations of windiness in climate change simulations using two state-of-the-art regional climate models driven by ECHAM5 indicate that not only Central but also East-Central Europe may be affected by higher surface wind speeds at the end of the 21st century. These changes are partially associated with the increased pressure gradient over Europe identified in the ECHAM5 simulations. Thus, with respect to the area affected, as well as to the synoptic and mesoscale storm features, Kyrill may serve as an interesting study case for assessing future storm impacts.
Abstract:
In this article, we investigate how the choice of the attenuation factor in an extended version of Katz centrality influences the centrality of the nodes in evolving communication networks. For given snapshots of a network, observed over a period of time, recently developed communicability indices aim to identify the best broadcasters and listeners (receivers) in the network. Here we explore the constraint on the attenuation factor in relation to the spectral radius (the largest eigenvalue) of the network at any point in time, and its computation in the case of large networks. We compare three different communicability measures: standard, exponential, and relaxed (where the spectral radius bound on the attenuation factor is relaxed and the adjacency matrix is normalised in order to maintain the convergence of the measure). Furthermore, using a vitality-based measure of both the standard and relaxed communicability indices, we look at ways of identifying the individuals most important for broadcasting and receiving messages in community-bridging roles. We compare those measures with the scores produced by an iterative version of the PageRank algorithm and illustrate our findings with three examples of real-life evolving networks: the MIT reality mining data set, consisting of daily communications between 106 individuals over the period of one year; a UK Twitter mentions network, constructed from the direct tweets between 12.4k individuals during one week; and a subset of the Enron email data set.
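For reference, the resolvent form of Katz centrality that underlies these indices, together with the attenuation factor constraint discussed above, can be written as follows (notation is ours, not the article's):

\[
\mathbf{c} = \sum_{k=1}^{\infty} \alpha^k A^k \mathbf{1} = \left( (I - \alpha A)^{-1} - I \right) \mathbf{1},
\qquad 0 < \alpha < \frac{1}{\rho(A)},
\]

where \(\rho(A)\) is the spectral radius of the adjacency matrix \(A\). For a time-ordered sequence of snapshots \(A_{[1]}, \dots, A_{[M]}\), the dynamic communicability matrix developed in the literature the article builds on takes the product form

\[
Q = (I - \alpha A_{[1]})^{-1} (I - \alpha A_{[2]})^{-1} \cdots (I - \alpha A_{[M]})^{-1},
\]

whose row sums rank broadcasters and whose column sums rank receivers; the relaxed variant normalises each snapshot so that convergence is maintained even when \(\alpha\) exceeds the bound \(1 / \max_k \rho(A_{[k]})\).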