809 results for Parallel Work Experience, Practise, Architecture
Abstract:
BACKGROUND: Chronic idiopathic urticaria (CIU) is defined by the almost daily presence of urticaria for at least 6 weeks without an identifiable cause. Symptoms include short-lived wheals, itching, and erythema. CIU significantly impairs a patient's quality of life (QoL). Levocetirizine is a latest-generation antihistamine approved for CIU. AIM: To compare the efficacy of levocetirizine, 5 mg, with placebo for the symptoms and signs of CIU, as well as for QoL and productivity. METHODS: The primary evaluation criteria were the pruritus severity scores over 1 week of treatment and over 4 weeks. QoL was assessed via the Dermatology Life Quality Index (DLQI). RESULTS: Baseline pruritus severity scores were comparable in the two treatment groups (2.06+/-0.58). After 1 week, levocetirizine was superior to placebo and demonstrated considerable efficacy (difference=0.78, P<0.001). This efficacy was maintained over the entire study period (4 weeks, P<0.001). The number and size of wheals were considerably reduced compared with placebo over 1 week and over the total treatment period (P
Abstract:
GOALS OF WORK: The aim of this study was to evaluate pain intensity and the application of the WHO guidelines for cancer pain treatment in patients with prostate cancer treated at Swiss cancer centers. MATERIALS AND METHODS: We analyzed a series of five multicenter phase II clinical trials which examined the palliative effect of different chemotherapies in patients with advanced hormone-refractory prostate carcinoma. From 170 patients, 1,018 visits were evaluable for our purpose, including ratings of pain intensity by patients and prescribed analgesics. MAIN RESULTS: No or mild pain was indicated by patients in 36 to 55% of the visits, more than mild pain in 30 to 46%. In 21% of the visits, the WHO pain treatment criteria (treatment according to one of the three steps; oral, rectal or transdermal application of the main dose; administration on a regular schedule) were fulfilled, and the Cleeland index was positive according to all recommendations. In 6% of the visits, neither the WHO criteria were fulfilled nor was the Cleeland index positive. This indicates insufficient pain treatment: the WHO guidelines were not followed, and the prescribed analgesics were not sufficiently potent for the rated pain intensity. CONCLUSIONS: In this selective Swiss sample, the standard of analgesic treatment is high. However, there is still scope for improvement. This cannot be solved solely by improving the knowledge of the physicians. Programs to change patients' attitudes towards cancer pain, training to improve physicians' communication skills, and institutional changes may be promising strategies.
Abstract:
CONCLUSION: Our self-developed planning and navigation system has proven its capacity for accurate surgery on the anterior and lateral skull base. With the incorporation of augmented reality, image-guided surgery will evolve into 'information-guided surgery'. OBJECTIVE: Microscopic or endoscopic skull base surgery is technically demanding, and its outcome has a great impact on a patient's quality of life. The project aimed to develop and evaluate enabling navigation tools for surgical simulation, planning, training, education, and performance. This clinically applied technological research was complemented by a series of patients (n=406) treated with anterior and lateral skull base procedures between 1997 and 2006. MATERIALS AND METHODS: Optical tracking technology was used for positional sensing of instruments. A newly designed dynamic reference base, with specific registration techniques using a fine needle pointer or ultrasound, enables the surgeon to work with a target error of <1 mm. An automatic registration assessment method, which provides the user with a color-coded fused representation of CT and MR images, indicates to the surgeon the location and extent of registration (in)accuracy. Integration of a small tracker camera mounted directly on the microscope permits an ergonomically advantageous way of working in the operating room. Additionally, guidance information (augmented reality) from multimodal datasets (CT, MRI, angiography) can be overlaid directly onto the surgical microscope view. The virtual simulator, as a training tool in endonasal and otological skull base surgery, provides an understanding of the anatomy as well as preoperative practice using real patient data. RESULTS: Using our navigation system, no major complications occurred, even though the series included difficult skull base procedures.
An improved quality of surgical outcome was identified compared with our control group without navigation and compared with the literature. Surgical time was reduced, and more minimally invasive approaches were possible. According to the participants' questionnaires, the educational effect of the virtual simulator in our residency program received a high ranking.
Abstract:
An important problem in computational biology is finding the longest common subsequence (LCS) of two nucleotide sequences. This paper examines the correctness and performance of a recently proposed parallel LCS algorithm that uses successor tables and pruning rules to construct a list of sets from which an LCS can be easily reconstructed. Counterexamples are given for two pruning rules that were given with the original algorithm. Because of these errors, performance measurements originally reported cannot be validated. The work presented here shows that speedup can be reliably achieved by an implementation in Unified Parallel C that runs on an Infiniband cluster. This performance is partly facilitated by exploiting the software cache of the MuPC runtime system. In addition, this implementation achieved speedup without bulk memory copy operations and the associated programming complexity of message passing.
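The parallel successor-table algorithm the paper analyzes builds a list of sets from which an LCS is reconstructed; the problem it solves can be illustrated with the standard serial dynamic-programming formulation. This sketch is the textbook algorithm, not the paper's successor-table method:

```python
def lcs(a: str, b: str) -> str:
    """Return one longest common subsequence of a and b
    using the classic O(len(a) * len(b)) DP table."""
    m, n = len(a), len(b)
    # dp[i][j] = LCS length of a[:i] and b[:j]
    dp = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m):
        for j in range(n):
            if a[i] == b[j]:
                dp[i + 1][j + 1] = dp[i][j] + 1
            else:
                dp[i + 1][j + 1] = max(dp[i][j + 1], dp[i + 1][j])
    # Backtrack through the table to recover one LCS.
    out = []
    i, j = m, n
    while i and j:
        if a[i - 1] == b[j - 1]:
            out.append(a[i - 1])
            i -= 1
            j -= 1
        elif dp[i - 1][j] >= dp[i][j - 1]:
            i -= 1
        else:
            j -= 1
    return "".join(reversed(out))
```

The parallel algorithm distributes this quadratic work across processes; the successor tables let each process skip over positions where no match can extend the current subsequence.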
Abstract:
Linear programs, or LPs, are often used in optimization problems, such as improving manufacturing efficiency or maximizing the yield from limited resources. The most common method for solving LPs is the Simplex Method, which will yield a solution, if one exists, but over the real numbers. From a purely numerical standpoint it will be an optimal solution, but quite often we desire an optimal integer solution. A linear program in which the variables are also constrained to be integers is called an integer linear program, or ILP. The focus of this report is to present a parallel algorithm for solving ILPs. We discuss a serial algorithm using a breadth-first branch-and-bound search to check the feasible solution space, and then extend it into a parallel algorithm using a client-server model. In the parallel model, the search may not be truly breadth-first, depending on the solution time for each node in the solution tree. Our search takes advantage of pruning, often resulting in super-linear improvements in solution time. Finally, we present results from sample ILPs, describe a few modifications to enhance the algorithm and improve solution time, and offer suggestions for future work.
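The serial breadth-first branch-and-bound search described above can be sketched on a small ILP, the 0/1 knapsack problem, whose LP relaxation happens to be solvable greedily. The problem choice and function names are illustrative assumptions (positive weights assumed), not the report's own code:

```python
from collections import deque

def knapsack_bnb(values, weights, capacity):
    """Breadth-first branch-and-bound for a 0/1 knapsack ILP.

    Each node fixes a prefix of take/skip decisions; its bound is the
    LP (fractional) relaxation of the remaining subproblem, solved
    greedily after sorting items by value density."""
    items = sorted(zip(values, weights), key=lambda vw: vw[0] / vw[1], reverse=True)
    n = len(items)

    def bound(k, value, room):
        # Optimistic estimate: fill remaining room with fractional items.
        for v, w in items[k:]:
            if w <= room:
                room -= w
                value += v
            else:
                return value + v * room / w
        return value

    best = 0
    queue = deque([(0, 0, capacity)])  # (next item index, value so far, room left)
    while queue:
        k, value, room = queue.popleft()
        best = max(best, value)
        if k == n or bound(k, value, room) <= best:
            continue  # prune: the relaxation cannot beat the incumbent
        v, w = items[k]
        if w <= room:
            queue.append((k + 1, value + v, room - w))  # branch: take item k
        queue.append((k + 1, value, room))              # branch: skip item k
    return best
```

In the client-server version, the server would hand out queue entries to workers and collect their incumbents; because workers finish nodes at different rates, the frontier is no longer strictly breadth-first, exactly as the report notes.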
Abstract:
This dissertation is a report on a collaborative project between the Computer Science and Humanities Departments to develop case studies that focus on issues of communication in the workplace, and on the results of their use in the classroom. My argument is that case study teaching simulates real-world experience in a meaningful way, offering a teachable means of developing phronesis, the reasoned capacity to act for the good in public. In addition, it can be read as a "how-to" guide for educators who may wish to construct their own case studies. To that end, I have included a discussion of the ethnographic methodologies employed and how they were adapted to our more pragmatic ends. Finally, I present my overarching argument for a new appraisal of the concept of techné. This reappraisal emphasizes its productive activity, poiesis, rather than focusing on its knowledge, as has been the case in the past. I propose that focusing on the telos, the end outside the production, contributes to the diminishment, if not complete foreclosure, of a rich concept of techné.
Abstract:
Small clusters of gallium oxide, a technologically important high-temperature ceramic, together with the interaction of nucleic acid bases with graphene and small-diameter carbon nanotubes, are the focus of the first-principles calculations in this work. A high-performance parallel computing platform was also developed to perform these calculations at Michigan Tech. The first-principles calculations are based on density functional theory, employing either the local density or gradient-corrected approximation together with plane wave and Gaussian basis sets. Bulk Ga2O3 is known to be a very good candidate for fabricating electronic devices that operate at high temperatures. To explore the properties of Ga2O3 at the nanoscale, we have performed a systematic theoretical study of small polyatomic gallium oxide clusters. The calculations find that all lowest-energy isomers of GamOn clusters are dominated by Ga-O bonds over metal-metal or oxygen-oxygen bonds. Analysis of atomic charges suggests the clusters are highly ionic, similar to bulk Ga2O3. In the study of the sequential oxidation of these clusters starting from Ga2O, it is found that the most stable isomers display up to four different backbones of constituent atoms. Furthermore, the predicted configuration of the ground state of Ga2O was recently confirmed by the experimental results of Neumark's group. Guided by the results of the gallium oxide cluster calculations, the performance-related challenge of computational simulations, that of producing high-performance computing platforms, was then addressed. Several engineering aspects were thoroughly studied during the design, development, and implementation of the high-performance parallel computing platform, rama, at Michigan Tech. In an attempt to stay true to the principles of the Beowulf revolution, the rama cluster was extensively customized to make it easy to understand and use, for administrators as well as end-users.
Following the results of benchmark calculations, and to keep up with the complexity of the systems under study, rama has been expanded to a total of sixty-four processors. Interest in the non-covalent interaction of DNA with carbon nanotubes has steadily increased during the past several years. This hybrid system, at the junction of the biological regime and the nanomaterials world, possesses features which make it very attractive for a wide range of applications. Using the in-house computational power available, we have studied details of the interaction of nucleic acid bases with a graphene sheet as well as with a high-curvature, small-diameter carbon nanotube. The calculated trend in the binding energies strongly suggests that the polarizability of the base molecules determines the interaction strength of the nucleic acid bases with graphene. When comparing the results obtained here for physisorption on the small-diameter nanotube with those from the study on graphene, it is observed that the interaction strength of the nucleic acid bases is smaller for the tube. Thus, these results show that the effect of introducing curvature is to reduce the binding energy. The binding energies for the two extreme cases of negligible curvature (i.e. a flat graphene sheet) and of very high curvature (i.e. a small-diameter nanotube) may be considered as upper and lower bounds. This finding represents an important step towards a better understanding of the experimentally observed sequence-dependent interaction of DNA with carbon nanotubes.
Abstract:
Writing center scholarship and practice have approached how issues of identity influence communication but have not fully considered ways of making identity a key feature of writing center research or practice. This dissertation suggests a new way to view identity: through an experience of "multimembership", or the consideration that each identity is constructed from the numerous community memberships that make it up. Etienne Wenger (1998) proposes that a fully formed identity is ultimately impossible, but it is through the work of reconciling memberships that important individual and community transformations can occur. Wenger also argues that reconciliation "is the most significant challenge" for those moving into new communities of practice (groups that "engage in a process of collective learning in a shared domain of human endeavor" (4)), yet this challenge often remains tacit. This dissertation therefore examines and makes explicit how this important work is done at two different research sites: a university writing center (the Michigan Tech Multiliteracies Center) and a multinational corporation (Kimberly-Clark Corporation). Drawing extensively on qualitative ethnographic methods, including interview transcriptions, observations, and case studies, as well as work from scholars in writing center studies (Grimm, Denney, Severino), literacy studies (New London Group, Street, Gee), composition (Horner and Trimbur, Canagarajah, Lu), rhetoric (Crowley), and identity studies (Anzaldua, Pratt), I argue that, based on evidence from the two sites, writing centers need to educate tutors not only to take identity into consideration, but also to make individuals' reconciliation work more visible, as it will continue once students and tutors leave the university.
Further, as my research at the Michigan Tech Multiliteracies Center and Kimberly-Clark will show, communities can (and should) change their practices in ways that account for reconciliation work as identity, communication, and learning are inextricably bound up with one another.
Abstract:
In the realm of computer programming, the experience of writing a program is used to reinforce concepts and evaluate ability. This research uses three case studies to evaluate the introduction of testing through Kolb's Experiential Learning Model (ELM). We then analyze the impact of those testing experiences to determine methods for improving future courses. The first testing experience that students encounter is unit test reports in their early courses. This course demonstrates that automating and improving feedback can provide more ELM iterations. The JUnit Generation (JUG) tool also provided a positive experience for the instructor by reducing the overall workload. Later, undergraduate and graduate students have the opportunity to work together in a multi-role Human-Computer Interaction (HCI) course. The interactions use usability analysis techniques, with graduate students as usability experts and undergraduate students as design engineers. Students gain experience testing the user experience of their product prototypes using methods ranging from heuristic analysis to user testing. From this course, we learned the importance of the instructor's role in the ELM. As more roles were added to the HCI course, a desire arose to provide more complete, quality-assured software. This inspired the addition of unit testing experiences to the course. However, we learned that significant preparations must be made to apply the ELM when students are resistant. The research presented through these courses was driven by the recognition of a need for testing in a Computer Science curriculum. Our understanding of the ELM suggests the need for student experience when being introduced to testing concepts. We learned that experiential learning, when appropriately implemented, can provide benefits to the Computer Science classroom. When examined together, these course-based research projects provided insight into building strong testing practices into a curriculum.
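As a rough Python analogue of the JUnit-based unit-test reports described above (the tested function and its cases are invented for illustration, not taken from the JUG tool):

```python
import unittest

def median(xs):
    """A student-submitted function under test (illustrative)."""
    s = sorted(xs)
    n = len(s)
    mid = n // 2
    return s[mid] if n % 2 else (s[mid - 1] + s[mid]) / 2

class TestMedian(unittest.TestCase):
    # Each failing assertion becomes a line in the generated feedback
    # report, giving the student another iteration of the ELM loop.
    def test_odd_length(self):
        self.assertEqual(median([3, 1, 2]), 2)

    def test_even_length(self):
        self.assertEqual(median([4, 1, 3, 2]), 2.5)

if __name__ == "__main__":
    unittest.main(argv=["median_demo"], exit=False)
```

The point of automating such reports is that students receive concrete, repeatable feedback on every submission instead of waiting for manual grading.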
Abstract:
The relation between theory and practice in social work has always been controversial. Recently, many have underlined how language is crucial in order to capture how knowledge is used in practice. This article introduces a language perspective to the issue, rooted in the 'strong programme' in the sociology of knowledge and in Wittgenstein's late work. According to this perspective, the meaning of categories and concepts corresponds to the use that concrete actors make of them as a result of on-going negotiation processes in specific contexts. Meanings may vary dramatically across social groups moved by different interests and holding different cultures. Accordingly, we may reformulate the issue of theory and practice in terms of the connections between different language games and the power relationships between segments of the professional community. In this view, the point is, in any case, to look at how theoretical language relates to practitioners' broader frames, and how it is transformed while providing words for making sense of experience.
Abstract:
Several commentators have expressed disappointment with New Labour's apparent adherence to the policy frameworks of the previous Conservative administrations. The employment orientation of its welfare programmes, the contradictory nature of the social exclusion initiatives, and the continuing obsession with public sector marketisation, inspections, audits, standards and so on have all come under critical scrutiny (cf. Blyth 2001; Jordan 2001; Orme 2001). This paper suggests that in order to understand the socio-economic and political contexts affecting social work we need to examine the relationship between New Labour's modernisation project and its insertion within an architecture of global governance. In particular, membership of the European Union (EU), International Monetary Fund (IMF) and World Trade Organisation (WTO) sets the parameters for domestic policy in important ways. Whilst much has been written about the economic dimensions of 'globalisation' in relation to social work, rather less has been noted about the ways in which domestic policy agendas are driven by multilateral governance objectives. This policy dimension is important in trying to respond to various changes affecting social work as a professional activity. What is possible, what is encouraged, and how things might be done are tightly bounded by the policy frameworks governing practice and affected by those governing the lives of service users. It is unhelpful to see policy formulation in purely national terms, as the UK is inserted into a network governance structure: a regulatory framework where decisions are made by many countries, organisations and agencies. Together, they are producing a 'new legal regime', characterised by a marked neo-liberal policy agenda. This paper aims to demonstrate the relationship of New Labour's modernisation programme to these new forms of legality by examining two main policy areas and the welfare implications they are enmeshed in.
The first is privatisation, and the second is social policy in the European Union. Examining these areas allows a demonstration of how much of the New Labour programme can be understood as a local implementation of a transnational strategy, how parts of that strategy produce much of the social exclusion it purports to address, and how social welfare, and particularly social work, are noticeable by their absence within the policy discourses of the strategy. The paper details how the privatisation programme is considered to be a crucial vehicle for the further development of a transnational political economy, where capital accumulation has been redefined as 'welfare'. In this development, frameworks, codes and standards are central, and the final section of the paper examines how the modernisation strategy of the European Union depends upon social policy marked by an employment orientation and risk rationality, aimed at reconfiguring citizen identities. The strategy is governed through an 'open method of coordination', in which codes, standards, benchmarks and so on play an important role. The paper considers the modernisation strategy, and the new legality within which it is embedded, as dependent upon social policy as a technology of liberal governance, one demonstrating a new rationality in comparison to that governing post-Second World War welfare, and which aims to reconfigure institutional infrastructure and citizen identity.
Abstract:
Social work at global levels, and across international and intercultural divides, is probably more important now than ever before in our history. It may be that the very form our ideas about intercultural work take needs to be re-examined in the light of recent global changes and uncertainties. In this short position paper I wish to offer some considerations about how we might approach the field of intercultural social work in order to gain new insights about how we practise at both local and global levels. For me, much of the promise of an intercultural social work (and for the purposes of this paper I see aspects of international social work in much the same light) lies in its focus on the way we categorise ourselves, our ideas and our experiences in relation to others. The very notion of intercultural or international social work is based on assumptions about boundaries, differences, and ways of differentiating and defining sets of experiences. Whether these are deemed "cultural" or "national" is of less importance. Once we are forced to examine these assumptions, about how and why we categorise ourselves in relation to other people in particular ways, the way is opened for us to be much more critical about the bases of our own, often very deep-seated, thinking. This understanding, about how and why notions of "difference" operate in the way they do, can potentially open our understanding to all the other ways, besides cultural or national labelling, in which we categorise and create differences between ourselves and others. Intercultural social work, taken as a potential site for understanding the creation of difference, then has the potential to help us critically examine the bases of much of our practice in any setting, since most practice involves some kind of categorisation of phenomena.
Abstract:
This article details the American experience of welfare reform, and specifically its experience instituting workfare programs for participants. In the United States, the term "welfare" is most commonly used to refer to the program for single mothers and their families, formerly called Aid to Families with Dependent Children (AFDC) and now Temporary Assistance to Needy Families (TANF). In 1996, politicians "ended welfare as we know it" by fundamentally changing this program with the passage of the Personal Responsibility and Work Opportunity Reconciliation Act of 1996 (PRWORA). The principal focus of the 1996 reform is mandatory work requirements enforced by sanctions and strict time limits on welfare receipt. While PRWORA's emphasis on work is not new, the difference is its significant ideological and policy commitment to employment, enforced by time limits. When welfare reform was enacted, some of its proponents recognized that welfare offices would have to change in order to develop individualized workfare plans, monitor progress, and impose sanctions. The "culture" of welfare offices had to be changed from being solely concerned with eligibility and compliance to individual, intensive casework. In this article, I discuss how implementing workfare programs has influenced the relationship between clients and their workers at the welfare office. I start by describing the burdens faced by offices even before the enactment of welfare reform. Local welfare offices were expected to run programs that emphasized compliance and eligibility at the same time as workfare programs, which require intensive, personal case management. The next section of the paper focuses on strategies welfare offices and workers use to navigate these contradictory expectations. Lastly, I present information on how clients react to workfare programs and some reasons they acquiesce to workfare contracts despite their unmet needs.
I conclude with recommendations of how to make workfare truly work for welfare clients.
Abstract:
Loss, grief, and other problems are events that most people experience during their lives. An earthquake is a disaster that makes people experience loss, grief, and other problems simultaneously. This crisis affects survivors as much as the dangers they face in their lives. Thus, most of them need support until they can solve their problems, relax, and resume their daily activities. We know that the profession of social work is to assist individuals who are seeking help. But there is a problem: how do social workers help clients efficiently, especially those clients who have suffered an earthquake? Generally, the role of social workers in helping the survivors of an earthquake is significant. To this end, the present paper describes the process of social casework and the skills social workers require to help survivors. These skills include: situational support, instilling hope, consoling, reassuring, focusing, developing solutions, and referral.
Abstract:
This paper proposes an extension to the television-watching paradigm that permits an end-user to enrich broadcast content. Examples of this enriched content are: virtual edits that allow the order of presentation within the content to be changed or that allow the content to be subsetted; conditional text, graphic, or video objects that can be placed to appear within content and triggered by viewer interaction; and additional navigation links that can be added to structure how other users view the base content object. The enriched content can be viewed directly within the context of the TV viewing experience. It may also be shared with other users within a distributed peer group. Our architecture is based on a model that allows the original content to remain unaltered, and which respects DRM restrictions on content reuse. The fundamental approach we use is to define an intermediate content enhancement layer that is based on the W3C's SMIL language. Using a pen-based enhancement interface, end-users can manipulate content that is saved in a home PDR setting. This paper describes our architecture and provides several examples of how our system handles content enhancement. We also describe a reference implementation for creating and viewing enhancements.
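A minimal sketch of what such a SMIL-based enhancement layer might look like: the recorded broadcast file is referenced but never modified, a virtual edit reorders two excerpts, and a viewer-added note appears on interaction. The file names and timing values are illustrative assumptions, not taken from the paper's reference implementation:

```xml
<!-- Hypothetical enhancement layer over an unaltered PDR recording -->
<smil xmlns="http://www.w3.org/ns/SMIL">
  <body>
    <seq>
      <!-- Virtual edit: play two excerpts of the recording in a new order -->
      <video src="recording.mpg" clipBegin="120s" clipEnd="180s"/>
      <video src="recording.mpg" clipBegin="30s"  clipEnd="90s"/>
      <!-- Conditional overlay: a viewer-added text note, shown on interaction -->
      <par>
        <video src="recording.mpg" clipBegin="180s" clipEnd="240s"/>
        <text src="note.txt" begin="activateEvent" dur="5s"/>
      </par>
    </seq>
  </body>
</smil>
```

Because the layer only references the base content by source and clip times, it can be shared with a peer group on its own, and DRM restrictions on the underlying recording remain enforceable at playback time.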