949 results for massive vectorial boson
Abstract:
China has a massive population of children with disabilities. To address the special needs of these children, special/inclusive education in China has developed dramatically since the early 1980s. This Special Issue brings together seven empirical studies from Chinese societies. These studies analyse inclusive discourses embedded in education policy documents; scrutinise the professional competence of inclusive education teachers; evaluate inclusive education practices in physical education, mathematics education, and job-related social skills education provided to students with disabilities; debate the in-class support required by inclusive education teachers; and discuss social attitudes towards people with disabilities. The foci, methods and theories vary across the seven studies, while their aims converge: all seek the best possible approaches and best available resources to facilitate inclusion. The knowledge built and lessons learned from these studies carry implications for future inclusive education practice in China and beyond.
Abstract:
Background: To date, bone-anchored prostheses are used to alleviate the concerns caused by socket-suspended prostheses and to improve the quality of life of transfemoral amputees (TFA). Currently, two implants are commercially available: the OPRA (Integrum AB, Sweden) and the ILP (Orthodynamics GmbH, Germany) [1-17]. The success of the OPRA technique is co-determined by the rehabilitation program. TFAs fitted with an osseointegrated implant perform progressive mechanical loading (i.e., static load bearing exercises (LBE)) to facilitate bone remodelling around the implant [18, 19]. Aim: This study investigated the trustworthiness of monitoring the load prescribed (LP) during experimental static LBEs using the vertical force provided by a mechanical bathroom scale, which is considered a surrogate of the actual load applied. Method: Eleven unilateral TFAs fitted with an OPRA implant performed five trials in four loading conditions. The forces and moments on the three axes of the implant were measured directly with an instrumented pylon including a six-channel transducer. The "axial" and "vectorial" comparisons, corresponding respectively to the difference between the force applied on the long axis of the fixation and LP, and between the resultant of the three components of the load applied and LP, were analysed. Results: For each loading condition, Wilcoxon one-sample signed-rank tests were used to investigate whether significant differences (p<0.05) could be demonstrated between the force applied on the long axis and LP, and between the resultant of the force and LP. The results demonstrated that the raw axial and vectorial differences were significantly different from zero in all conditions (p<0.05), except for the vectorial difference in the 40 kg loading condition (p=0.182). The raw axial difference was negative for all participants in every loading condition, except for TFA03 in the 10 kg condition (11.17 N).
Discussion & Conclusion: This study showed a significant lack of axial compliance. The load applied on the long axis was significantly smaller than LP in every loading condition, leading to a systematic underloading of the long axis of the implant during the proposed experimental LBE. Monitoring the vertical force might therefore be only partially reflective of the actual load applied, particularly on the long axis of the implant.
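The two comparisons described in the Method can be sketched numerically. This is a minimal illustration, not the study's analysis pipeline; the function names, the 9.81 m/s² conversion of the prescribed mass into newtons, and the example transducer readings are assumptions.

```python
import math

G = 9.81  # m/s^2: converts a prescribed mass in kg into a force in newtons (assumed convention)

def axial_difference(f_long_axis, load_prescribed_kg):
    """Axial comparison: force measured on the long axis of the fixation
    minus the load prescribed (LP), both expressed in newtons."""
    return f_long_axis - load_prescribed_kg * G

def vectorial_difference(fx, fy, fz, load_prescribed_kg):
    """Vectorial comparison: resultant of the three measured force
    components minus the load prescribed (LP), in newtons."""
    return math.sqrt(fx * fx + fy * fy + fz * fz) - load_prescribed_kg * G

# Hypothetical readings for a 40 kg loading condition
print(round(axial_difference(370.0, 40), 1))             # negative => under-loading of the long axis
print(round(vectorial_difference(30.0, 20.0, 370.0, 40), 1))
```

A negative axial difference, as reported for nearly all participants, means the bone-implant axis receives less than the prescribed load even when the bathroom-scale reading matches LP.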
Abstract:
The successful completion of the Human Genome Project (HGP) was an unprecedented scientific advance that has become an invaluable resource in the search for genes that cause monogenic and common (polygenic) diseases. Prior to the HGP, linkage analysis had successfully mapped many disease genes for monogenic disorders; however, the limitations of this approach were particularly evident for identifying causative genes in rare genetic disorders affecting lifespan and/or reproductive fitness, such as skeletal dysplasias. In this review, we illustrate the challenges of mapping disease genes in such conditions through the ultra-rare disorder fibrodysplasia ossificans progressiva (FOP) and we discuss the advances that are being made through current massively parallel (“next generation”) sequencing (MPS) technologies.
Abstract:
Artificial Neural Networks (ANNs) are used to solve a variety of problems in pattern recognition, robotic control, VLSI CAD and other areas. In most of these applications, a speedy response from the ANNs is imperative. However, ANNs comprise a large number of artificial neurons and a massive interconnection network among them; their implementation therefore involves compute-intensive operations, making the use of multiprocessor systems necessary. In this article, we present the implementation of ART1 and ART2 ANNs on ring and mesh architectures. The overall system design and implementation aspects are presented, along with the performance of the algorithms on ring, 2-dimensional mesh and n-dimensional mesh topologies. The parallel algorithm presented for the implementation of ART1 is not specific to any particular architecture, while the parallel algorithm for ART2 is more suitable for a ring architecture.
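The parallelization idea can be illustrated with a toy sketch: neurons are block-partitioned across processors, and a winner is selected by passing partial results around the ring. All names and the reduction scheme here are assumptions for illustration, not the paper's actual algorithm.

```python
# Hypothetical sketch: block-partitioning ART-style category neurons over a ring of nodes.

def partition(num_neurons, num_nodes):
    """Assign each neuron index to a node, balancing block sizes."""
    base, extra = divmod(num_neurons, num_nodes)
    blocks, start = [], 0
    for node in range(num_nodes):
        size = base + (1 if node < extra else 0)
        blocks.append(range(start, start + size))
        start += size
    return blocks

def ring_winner(local_bests):
    """Emulate a ring reduction: each node forwards the best (score, neuron)
    pair seen so far to its neighbour; after one full circuit every node
    knows the global winner."""
    best = (float("-inf"), -1)
    for node_best in local_bests:  # one hop per node around the ring
        best = max(best, node_best)
    return best

blocks = partition(10, 3)
print([len(b) for b in blocks])  # [4, 3, 3]
```

A ring suits this pattern because the winner-take-all reduction needs only neighbour-to-neighbour traffic, which is one plausible reason the ring variant is reported as more suitable for ART2.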
Abstract:
With the explosion of information resources, there is a pressing need to understand interesting text features, or topics, in massive bodies of text. This thesis proposes a theoretical model to accurately weight specific text features, such as patterns and n-grams. The proposed model achieves strong performance on two data collections, Reuters Corpus Volume 1 (RCV1) and Reuters-21578.
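To make the notion of weighting text features such as n-grams concrete, here is a small sketch that extracts n-grams and weights them with tf-idf. tf-idf is an illustrative stand-in; the thesis' own weighting model is not specified in the abstract.

```python
import math
from collections import Counter

def ngrams(tokens, n):
    """All contiguous n-grams of a token list."""
    return [tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]

def tfidf_weights(docs, n=2):
    """Illustrative stand-in weighting: tf-idf over n-grams.
    Rare-but-repeated n-grams get high weight; ubiquitous ones get zero."""
    doc_grams = [Counter(ngrams(d.split(), n)) for d in docs]
    df = Counter(g for grams in doc_grams for g in grams)  # document frequency
    num_docs = len(docs)
    return [
        {g: tf * math.log(num_docs / df[g]) for g, tf in grams.items()}
        for grams in doc_grams
    ]

docs = ["stock markets fell", "stock markets rose", "rain fell overnight"]
w = tfidf_weights(docs, n=2)
```

On collections like RCV1, a weighting of this general shape is what lets a model rank discriminative patterns above boilerplate phrasing.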
Abstract:
Japan is in the midst of massive law reform. Mired in ongoing recession since the early 1990s, Japan has been implementing a new regulatory blueprint to kickstart a sluggish economy through structural change. A key element of this reform process is a rethink of corporate governance and its stakeholder relations. With a patchwork of legislative initiatives in areas as diverse as corporate law, finance, labour relations, consumer protection, public administration and civil justice, this new model is beginning to take shape. But to what extent does this model represent a break from the past? Some commentators are breathlessly predicting the "Americanisation" of Japanese law. They see the triumph of Western-style capitalism - the "End of History", to borrow the words of Francis Fukuyama - with its emphasis on market-based, arm's-length transactions. Others are more cautious, advancing the view that these new reforms are merely "creative twists" on what is a uniquely (although slowly evolving) strand of Japanese capitalism. This paper takes issue with both interpretations. It argues that the new reforms merely follow Japan's long tradition of 'adopting and adapting' foreign models to suit domestic purposes. They are neither the wholesale importation of "Anglo-Saxon" regulatory principles nor a thin veneer over a 'uniquely unique' form of Confucian cultural capitalism. Rather, they represent a specific and largely political solution (conservative reformism) to a current economic problem (recession). The larger themes of this paper are 'change' and 'continuity'. 'Change' suggests evolution towards something identifiable; 'continuity' suggests adhering to an existing state of affairs. Although notionally opposites, 'change' and 'continuity' have something in common: they both suggest some form of predictability and coherence in regulatory reform.
Our paper, by contrast, submits that Japanese corporate governance reform, or indeed law reform more generally in Japan, is context-specific and multi-layered (with different dimensions not necessarily all pulling in the same direction, for example in relations with key outside suppliers), and therefore more random or 'chaotic'.
Abstract:
India possesses a diverse and rich cultural heritage and is renowned as a 'land of festivals'. These festivals attract massive community involvement, paving the way for new materials such as 'Plaster of Paris' being used to 'modernize' the representation of idols, with very little thought given to issues of toxicity and environmental impact. Another dimension to the whole issue is the plight of the artisans and workers involved in the trade. Owing to the unorganized nature of the industry, there are minimal or no guidelines pertaining to the safety and health risks of the people involved. This paper attempts to address the complexities of the inherent hazards arising from these socio-environmental issues and to trace a scientific rationale for addressing them in a practical and pragmatic way.
Abstract:
Due to the increasing speed of landscape changes and the massive development of computer technologies, the methods of representing heritage landscapes using digital tools have become a worldwide concern in conservation research. The aim of this paper is to demonstrate how an ‘interpretative model’ can be used for the contextual design of heritage landscape information systems. This approach is explored through building a geographic information system database for St Helena Island national park in Moreton Bay, South East Queensland, Australia. Stakeholders' interpretations of this landscape were collected through interviews, and then used as a framework for designing the database. The designed database is a digital inventory providing contextual descriptions of the historic infrastructure remnants on St Helena Island. It also reveals the priorities of different sites in terms of historic research, landscape restoration, and tourism development. Additionally, this database produces thematic maps of the intangible heritage values, which could be used for landscape interpretation. This approach differs from existing methods in that building a heritage information system is treated as an interpretative activity, rather than a value-free replication of the physical environment. This approach also shows how a cultural landscape methodology can be used to create a flexible information system for heritage conservation. The conclusion is that an ‘interpretative model’ of database design facilitates a more explicit focus on information support, and is a potentially effective approach to user-centred design of geographic information systems.
Abstract:
The concept of the American Dream was subject to a strong re-evaluation process in the 1960s, as counterculture became a prominent force in American society. A massive generation of young people, moved by the Vietnam War, the hippie movement, and psychedelic experimentation, created substantial social turbulence in their efforts to break out of conventional patterns and to create a new kind of society. This thesis outlines and analyses the concept of the American Dream in popular imagination through three works of new journalism. My primary data consists of Tom Wolfe’s The Electric Kool-Aid Acid Test (1968), Hunter S. Thompson’s Fear and Loathing in Las Vegas: A Savage Journey to the Heart of the American Dream (1971), and Norman Mailer’s Armies of the Night: History as a Novel, the Novel as History (1968). In defining the American Dream, I discuss the history of the concept as well as its manifestations in popular culture. Because of its elusive and amorphous nature, the concept of the American Dream can only be examined in cultural texts that portray the values, sentiments, and customs of a certain era. I have divided the analytical section of my thesis into three parts. In the first part I examine how the authors discuss the American society of their time in relation to ideology, capitalism, and the media. In the second part I focus on the Vietnam War and the controversy it creates in relation to the notions of freedom and patriotism. In the third part I discuss how the authors portray the countercultural visions of a better America that challenged the traditional interpretations of the American Dream. I also discuss the dark side of the new dream: the problems and disillusions that came with the effort to change the world. This thesis is an effort to trace the relocation of the American Dream in the context of the 1960s counterculture and new journalism.
It hopes to provide a valuable addition to the cultural history of the sixties and to the effort of conceptualizing the American Dream.
Abstract:
Based on maps of the extragalactic radio sources Cyg A, Her A, Cen A, 3C 277.3 and others, arguments are given that the twin jets from the respective active galactic nucleus ram their channels repeatedly through thin, massive shells. The jets are thereby temporarily choked and blow radio bubbles. Warm shell matter in the cocoon shows up as radio-dark through electron scattering.
Abstract:
Introduction: Decompressive hemicraniectomy, clot evacuation, and aneurysmal interventions are considered aggressive surgical therapeutic options for the treatment of massive middle cerebral artery (MCA) infarction, intracerebral hemorrhage (ICH), and severe subarachnoid hemorrhage (SAH), respectively. Although these procedures save lives, little is actually known about their impact on outcomes other than short-term survival and functional status. The purpose of this study was to gain a better understanding of the personal and social consequences of surviving these aggressive surgical interventions, in order to aid acute care clinicians in helping family members make difficult decisions about undertaking such interventions. Methods: An exploratory mixed-methods study using a convergent parallel design was conducted to examine functional recovery (NIHSS, mRS & BI), cognitive status (Montreal Cognitive Assessment Scale, MoCA), quality of life (EuroQol 5-D), and caregiver outcomes (Bakas Caregiver Outcome Scale, BCOS) in a cohort of patients and families who had undergone aggressive surgical intervention for severe stroke between 2000 and 2007. Data were analyzed using descriptive statistics, univariate and multivariate analysis of variance, and multivariate logistic regression. Content analysis was used to analyze the qualitative interviews conducted with stroke survivors and family members. Results: Twenty-seven patients and 13 spouses participated in this study. Based on patient MoCA scores, overall cognitive status averaged 25.18 (range 23.4-26.9); current functional outcome scores were: NIHSS 2.22, mRS 1.74, and BI 88.5.
EQ-5D scores revealed no significant differences between patients and caregivers (p=0.585), and caregiver outcomes revealed no significant differences between male and female caregivers or between patient diagnostic groups (MCA, SAH, ICH; p=0.103). Discussion: Overall, patients and families were satisfied with quality of life and with the decisions made at the time of the initial stroke. There was consensus among study participants that formal community-based support (e.g., handibus, caregiving relief, rehabilitation assessments) should be continued for extended periods (e.g., years) post-stroke. Ongoing contact with health care professionals is valuable in helping them navigate the community as needs change over time.
Abstract:
Tolerance of Noise as a Necessity of Urban Life: noise pollution as an environmental problem and its cultural perceptions in the city of Helsinki. This study looks at the noise pollution problem and the change in the urban soundscape in the city of Helsinki during the period from the 1950s to the present day. The study investigates the formation of noise problems, the politicization of the noise pollution problem, noise-related civic activism, the development of environmental policies on noise, and the expectations that urban dwellers have had concerning their everyday soundscape. Both so-called street noise and the noise caused by, e.g., neighbors are taken into account. The study investigates whether our society contains, or has for some time contained, cultural and other elements that frame noise pollution as an essential or normal state of affairs in urban life. It is also discussed whether we are moving towards an artificial soundscape, meaning that the auditory reality, the soundscape, is more and more under human control. The concept of an artificial soundscape was used to crystallize the significance of human actions and the role of modern technology in shaping soundscapes, and also to link the changes in the modern soundscape to the economic, political, and social changes connected to the modernization process. It was argued that the critical period defining noise pollution as an environmental problem was the span from the end of the 1960s to the early 1970s. It seems that the massive increase in noise pollution caused by road traffic, together with the introduction of utopian traffic plans, was the key point that launched the moral protest against the increase of noise pollution and, in general, against the basic structures and mindsets of society, including attitudes towards nature. The study argues that after noise pollution was politicized and institutionalized, the urban soundscape gradually became the target of systematic interventions.
However, for various reasons, such as inconsistency in decision making, our increased capacity to shape the soundscape has not resulted in a healthy or pleasant urban soundscape. In fact, the number of people exposed to noise pollution is increasing. It is argued that our society contains cultural and other elements that urge us to see noise as a normal part of urban life. It is also argued that the possibility of experiencing natural, silent soundscapes seems to be the yardstick against which citizens of Helsinki have measured how successful we are in designing the (artificial) soundscape and whether the actions of noise control have been effective. This work discusses whose interests it serves when we are asked to accept noise pollution as a normal state of affairs. It is also suggested that the quality of the artificial soundscape ought to be radically politicized, which might give all citizens a better and more equal chance to express their needs and wishes concerning the urban soundscape, and also to decide how it ought to be designed.
Abstract:
In this study I discuss G. W. Leibniz's (1646-1716) views on rational decision-making from the standpoint of both God and man. The divine decision takes place within creation, as God freely chooses the best from an infinite number of possible worlds. While God's choice is based on absolutely certain knowledge, human decisions on practical matters are mostly based on uncertain knowledge. However, in many respects the two could be regarded as analogous, especially in more complicated situations. In addition to giving an overview of divine decision-making and discussing critically the criteria God favours in his choice, I provide an account of Leibniz's views on human deliberation, which includes some new ideas. One of these concerns the importance of estimating probabilities in making decisions: one estimates both the goodness of the act itself and of its consequences as far as the desired good is concerned. Another idea is related to the plurality of goods in complicated decisions and the competition this may provoke. Thirdly, heuristic models are used to sketch situations under deliberation in order to help in making the decision. Combining the views of Marcelo Dascal, Jaakko Hintikka and Simo Knuuttila, I argue that Leibniz applied two kinds of models of rational decision-making to practical controversies, often without explicating the details. The simpler, traditional pair-of-scales model is best suited to cases in which one has to decide for or against some option, or to distribute goods among parties and strive for a compromise. What may be of more help in more complicated deliberations is the novel vectorial model, which is an instance of the general mathematical doctrine of the calculus of variations. To illustrate this distinction, I discuss some cases in which he apparently applied these models in different kinds of situations. These examples support the view that the models had a systematic value in his theory of practical rationality.
Abstract:
Several channels provided by many-body couplings, both fermion-fermion and fermion-boson, for the evolution of the chemisorption system are discussed. This provides an opportunity for a systematic study of the effects of correlations reflected through the intricate pole structure of the adsorbate Green functions. The results of Newns, Anda and others in the context of chemisorption are generalized.
Abstract:
The need for reexamination of the standard model of strong, weak, and electromagnetic interactions is discussed, especially with regard to 't Hooft's criterion of naturalness. It has been argued that theories with fundamental scalar fields tend to be unnatural at relatively low energies. There are two solutions to this problem: (i) a global supersymmetry, which ensures the absence of all the naturalness-violating effects associated with scalar fields, and (ii) composite structure of the scalar fields, which starts showing up at energy scales where unnatural effects would otherwise have appeared. With reference to the second solution, this article reviews the case for dynamical breaking of the gauge symmetry and the technicolor scheme for the composite Higgs boson. This new interaction, of the scaled-up quantum chromodynamic type, keeps the new set of fermions, the technifermions, together in the Higgs particles. It also provides masses for the electroweak gauge bosons W± and Z0 through technifermion condensate formation. In order to give masses to the ordinary fermions, a new interaction, the extended technicolor interaction, which would connect the ordinary fermions to the technifermions, is required. The extended technicolor group breaks down spontaneously to the technicolor group, possibly as a result of the "tumbling" mechanism, which is discussed here. In addition, the author presents schemes for the isospin breaking of mass matrices of ordinary quarks in the technicolor models. In generalized technicolor models with more than one doublet of technifermions or with more than one technicolor sector, we have additional low-lying degrees of freedom, the pseudo-Goldstone bosons. The pseudo-Goldstone bosons in the technicolor model of Dimopoulos are reviewed and their masses computed. In this context the vacuum alignment problem is also discussed. 
An effective Lagrangian is derived describing colorless low-lying degrees of freedom for models with two technicolor sectors in the combined limits of chiral symmetry and large number of colors and technicolors. Finally, the author discusses suppression of flavor-changing neutral currents in the extended technicolor models.
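As a hedged aside on the mass-generation step mentioned above, the standard one-technidoublet relations (textbook formulas, not specific to the particular models reviewed) can be written as:

```latex
% Electroweak gauge boson masses generated when the technipion decay
% constant F_{TC} plays the role of the elementary Higgs vev
% (F_{TC} \approx 246\,\mathrm{GeV} for a single technidoublet):
\begin{align}
  m_W &= \tfrac{1}{2}\, g\, F_{TC}, \\
  m_Z &= \tfrac{1}{2}\,\sqrt{g^{2} + g'^{2}}\, F_{TC} = \frac{m_W}{\cos\theta_W},
\end{align}
% where g and g' are the SU(2)_L and U(1)_Y gauge couplings and
% \theta_W is the weak mixing angle.
```

With several technidoublets or several technicolor sectors, the individual decay constants combine in quadrature to reproduce the same 246 GeV scale, which is the context for the additional pseudo-Goldstone bosons discussed above.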