921 results for shot putting
Abstract:
Background: Trypanosoma evansi infections, commonly called 'surra', cause significant economic losses to the livestock industry. While this infection is mainly restricted to large animals such as camels, donkeys and equines, recent reports indicate its ability to infect humans. There are no World Animal Health Organization (WAHO)-prescribed diagnostic tests or vaccines available against this disease, and the available drugs show significant toxicity. There is an urgent need to develop improved methods of diagnosis and control measures for this disease. Unlike its related human parasites T. brucei and T. cruzi, whose genomes have been fully sequenced, the T. evansi genome sequence remains unavailable, and very little effort is being made to develop improved methods of prevention, diagnosis and treatment. With a view to identifying potential diagnostic markers and drug targets, we have studied the clinical proteome of T. evansi infection using mass spectrometry (MS).
Methodology/Principal Findings: Using a shotgun proteomic approach involving nano-LC quadrupole time-of-flight (QTOF) mass spectrometry, we have identified over 160 proteins expressed by T. evansi in mice infected with a camel isolate. Homology-driven searches for protein identification from MS/MS data led to most of the matches arising from related Trypanosoma species. The proteins identified belonged to various functional categories, including metabolic enzymes, DNA metabolism, transcription, translation, and cell-cell communication and signal transduction. TCA cycle enzymes were strikingly missing, possibly suggesting their low abundance. The clinical proteome revealed the presence of known and potential drug targets such as oligopeptidases, kinases, cysteine proteases and more.
Conclusions/Significance: Previous proteomic studies on trypanosomal infections, including the human parasites T. brucei and T. cruzi, have been carried out on lab-grown cultures. For T. evansi infection, this is the first proteomic study reported to date. In addition to providing a glimpse into the biology of this neglected disease, our study is a first step towards the identification of diagnostic biomarkers, novel drug targets and potential vaccine candidates to fight T. evansi infections.
Abstract:
The optimal design of a multiproduct batch chemical plant is formulated as a multiobjective optimization problem, and the resulting constrained mixed-integer nonlinear program (MINLP) is solved by the nondominated sorting genetic algorithm (NSGA-II). By putting bounds on the objective function values, the constrained MINLP problem can be solved efficiently by NSGA-II to generate a set of feasible nondominated solutions in the range desired by the decision-maker in a single run of the algorithm. The evolution of the entire set of nondominated solutions helps the decision-maker make a better choice of the appropriate design from among several alternatives. The large set of solutions also provides a rich source of excellent initial guesses for solving the same problem by alternative approaches to achieve any specific target for the objective functions.
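The two ingredients the abstract leans on, Pareto nondominance and decision-maker bounds on objective values, can be illustrated with a minimal sketch. This is our illustration, not the paper's MINLP formulation: the candidate objective vectors, bounds, and function names below are hypothetical, and both objectives are treated as minimized.

```python
# Minimal sketch of two NSGA-II ingredients: Pareto dominance and extraction
# of the nondominated front, followed by filtering candidate designs to the
# objective-value range desired by the decision-maker.

def dominates(a, b):
    """True if objective vector a Pareto-dominates b (all <=, at least one <)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def nondominated_front(points):
    """Return the points not dominated by any other point, in input order."""
    return [p for p in points if not any(dominates(q, p) for q in points if q != p)]

def within_bounds(p, lower, upper):
    """Keep only solutions whose objective values fall in the desired range."""
    return all(lo <= v <= hi for v, lo, hi in zip(p, lower, upper))

# Toy bi-objective values (e.g. annualized cost vs. capital investment)
# for six hypothetical candidate plant designs.
candidates = [(1.0, 9.0), (2.0, 7.0), (3.0, 8.0), (4.0, 4.0), (6.0, 3.0), (7.0, 5.0)]
front = nondominated_front(candidates)
desired = [p for p in front if within_bounds(p, (0.0, 0.0), (5.0, 8.0))]
```

Bounding the objectives, as the abstract describes, simply discards nondominated solutions outside the decision-maker's range, so the search effort concentrates where trade-offs are actually of interest.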
Abstract:
An approximate dynamic programming (ADP) based neurocontroller is developed for a heat transfer application. The heat transfer problem for a fin in a car's electronic module is modeled as a nonlinear distributed parameter (infinite-dimensional) system by taking into account heat loss and generation due to conduction, convection and radiation. A low-order, finite-dimensional lumped parameter model for this problem is obtained by using Galerkin projection and basis functions designed through the 'proper orthogonal decomposition' (POD) technique and the 'snapshot' solutions. A suboptimal neurocontroller is obtained with a single network adaptive critic (SNAC). A further contribution of this paper is the development of an online robust controller to account for unmodeled dynamics and parametric uncertainties. A weight update rule is presented that guarantees boundedness of the weights and eliminates the need for the persistence of excitation (PE) condition to be satisfied. Since the ADP and neural network based controllers are of fairly general structure, they appear to have the potential to be controller synthesis tools for nonlinear distributed parameter systems, especially where it is difficult to obtain an accurate model.
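The snapshot-POD model-reduction step mentioned above has a standard numerical core that can be sketched briefly. This is a generic illustration under our own assumptions (synthetic snapshot data, illustrative sizes), not the paper's fin model: snapshots of the full-order state are stacked as columns, and the leading left singular vectors give an orthonormal basis for Galerkin projection.

```python
import numpy as np

# Sketch of snapshot POD: build a snapshot matrix, take its SVD, and keep
# the leading modes as an orthonormal basis for a low-order model.
rng = np.random.default_rng(0)
n, m, r = 200, 30, 4                # state dimension, snapshots, retained modes
snapshots = rng.standard_normal((n, m))   # synthetic stand-in for PDE solutions

U, s, _ = np.linalg.svd(snapshots, full_matrices=False)
basis = U[:, :r]                     # POD modes: orthonormal columns

# Galerkin projection maps an n-dimensional state to r modal coordinates
# and back; the reduced dynamics would evolve the coordinates `a`.
x = snapshots[:, 0]
a = basis.T @ x                      # reduced (modal) coordinates
x_approx = basis @ a                 # reconstruction in the span of the modes
```

In practice the snapshots come from simulations or experiments of the distributed parameter system, and the number of retained modes is chosen from the decay of the singular values.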
Abstract:
Tinnitus is a frequent consequence of noise trauma. Usually, however, the main focus regarding the consequences of noise trauma is placed on hearing loss rather than tinnitus. The objectives of the present study were to assess various aspects of noise-related tinnitus in Finland: to determine the main causes of conscript acute acoustic traumas (AAT) in the military, assess tinnitus prevalence after noise trauma, characterize long-term AAT-related tinnitus prevalence and characteristics, assess occupational tinnitus, and evaluate the efficacy of hearing protection regulations in preventing hearing loss and tinnitus. The study comprised several independent noise-exposed groups: conscripts performing their military duty, former conscripts who had suffered an AAT over a decade earlier, bomb explosion victims, and retired army personnel. Tinnitus questionnaires were used to assess tinnitus prevalence and characteristics. For occupational tinnitus, occupational noise-induced hearing loss (NIHL) reports to the Finnish Institute of Occupational Health were reviewed. Tinnitus is a common result of AAT, blast exposure and long-term noise exposure. Despite hearing protection regulations, up to hundreds of AATs occur annually among conscripts in the Finnish Defence Forces (FDF). Conscript AATs are mainly due to accidental shots while the ear is unprotected; such shots account for approximately half of the cases, and only seldom is an AAT due to negligence. The most common causative weapon is the assault rifle, accounting for 81% of conscript AATs. After AAT, the majority of tinnitus cases resolve during military service and become asymptomatic. However, in one-fifth of the cases tinnitus persists, causing problems such as sleep and concentration difficulties for many. In Finland, occupational tinnitus often remains unreported in conjunction with NIHL reports.
In a survey of occupational NIHL cases, tinnitus was mentioned in only four per cent of reports. However, a subsequent inquiry revealed that almost 90% of the subjects in fact had tinnitus, indicating that most cases remained undetected and unreported. The best way to prevent noise-related tinnitus is prevention of noise trauma. In the military, hearing protection guidelines have been revised several times over the years. These regulations have been effective in reducing hearing loss among professional soldiers. There has also been a reduction in tinnitus cases, but the decrease was not significant. However, with improved hearing protection regulations, a significant reduction in the risk of more serious, disturbing tinnitus was observed.
Abstract:
The Master’s thesis examines whether and how decolonial cosmopolitanism is empirically traceable in the attitudes and practices of Costa Rican activists working in transnational advocacy organizations. Decolonial cosmopolitanism is defined as a form of cosmopolitanism from below that aims to propose ways of imagining – and putting into practice – a truly globe-encompassing civic community based not on relations of domination but on horizontal dialogue. This concept has been developed by, and shares its basic presumptions with, the theory of coloniality that the modernity/coloniality/decoloniality research group is putting forward. The thesis analyzes whether and how the workings of coloniality – the underlying ontological assumption of decolonial cosmopolitanism, broadly subsumable under the three logics of race, capitalism, and knowledge – are traceable in intermediate postcolonial transnational advocacy in Costa Rica. The method chosen to approach these questions is content analysis, applied to qualitative semi-structured in-depth interviews with Costa Rican activists working in advocacy organizations with transnational ties. Costa Rica was chosen because – while unquestionably a Latin American postcolonial country and thus within the geo-political context in which the concept was developed – it introduces a complex setting of socio-cultural and political factors that puts the explanatory potential of the concept to the test. The research group applies the term ‘coloniality’ to describe how the social, political, economic, and epistemic relations developed during the colonization of the Americas order global relations and sustain Western domination still today, through what is called the logic of coloniality. It also takes these processes as a point of departure for imagining how counter-hegemonic contestations can be achieved through the linking of local struggles to a global community based on pluriversality.
The issues chosen as the most relevant expressions of the logic of coloniality in the context of Costa Rican transnational advocacy, and thus empirically scrutinized, are national identity as a ‘white’ exceptional nation with gender equality (racism), the neoliberalization of advocacy in the Global South (capitalism), and finally Eurocentrism, but also transnational civil society networks as a first step in decolonizing civic activism (epistemic domination). The findings of this thesis show that the various ways in which activists adopt practices and outlooks stemming from the center in order to empower themselves and their constituencies, and the ways in which their particular geo-political position affects their work, cannot be reduced to one single logic of coloniality. Nonetheless, the aspects of race, gender, capitalism and epistemic hegemony do undeniably affect activist cosmopolitan attitudes and transnational practices. While the premises on which the concept of decolonial cosmopolitanism is based suffer from some analytical drawbacks, its importance is seen in its ability to take as a point of departure the concrete spaces in which situated social relations develop. It thus allows for perceiving the increasing interconnectedness between different levels of social and political organizing as contributing to cosmopolitan visions that combine local situatedness with global community as a normative horizon – visions that have influenced not only academic debate but also political projects.
Abstract:
This thesis studies optimisation problems related to modern large-scale distributed systems, such as wireless sensor networks and wireless ad-hoc networks. The concrete tasks that we use as motivating examples are the following: (i) maximising the lifetime of a battery-powered wireless sensor network, (ii) maximising the capacity of a wireless communication network, and (iii) minimising the number of sensors in a surveillance application. A sensor node consumes energy both when it is transmitting or forwarding data, and when it is performing measurements. Hence task (i), lifetime maximisation, can be approached from two different perspectives. First, we can seek optimal data flows that make the most out of the energy resources available in the network; such optimisation problems are examples of so-called max-min linear programs. Second, we can conserve energy by putting redundant sensors into sleep mode; we arrive at the sleep scheduling problem, in which the objective is to find an optimal schedule that determines when each sensor node is asleep and when it is awake. In a wireless network simultaneous radio transmissions may interfere with each other. Task (ii), capacity maximisation, therefore gives rise to another scheduling problem, the activity scheduling problem, in which the objective is to find a minimum-length conflict-free schedule that satisfies the data transmission requirements of all wireless communication links. Task (iii), minimising the number of sensors, is related to the classical graph problem of finding a minimum dominating set. However, if we are not only interested in detecting an intruder but also in locating the intruder, it is not sufficient to solve the dominating set problem; formulations such as minimum-size identifying codes and locating–dominating codes are more appropriate.
This thesis presents approximation algorithms for each of these optimisation problems, i.e., for max-min linear programs, sleep scheduling, activity scheduling, identifying codes, and locating–dominating codes. Two complementary approaches are taken. The main focus is on local algorithms, which are constant-time distributed algorithms. The contributions include local approximation algorithms for max-min linear programs, sleep scheduling, and activity scheduling. In the case of max-min linear programs, tight upper and lower bounds are proved for the best possible approximation ratio that can be achieved by any local algorithm. The second approach is the study of centralised polynomial-time algorithms in local graphs – these are geometric graphs whose structure exhibits spatial locality. Among other contributions, it is shown that while identifying codes and locating–dominating codes are hard to approximate in general graphs, they admit a polynomial-time approximation scheme in local graphs.
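The dominating-set connection behind task (iii) can be made concrete with the classical greedy approximation for minimum dominating set: repeatedly pick the node whose closed neighbourhood covers the most not-yet-dominated nodes, which yields a (ln n + 1)-approximation. This is a sketch of the general technique, not of the thesis's local algorithms or its PTAS for local graphs, and the graph below is a toy example.

```python
# Greedy (ln n + 1)-approximation for MINIMUM DOMINATING SET.
def greedy_dominating_set(adj):
    """adj: dict node -> set of neighbours. Returns a dominating set as a list."""
    undominated = set(adj)
    chosen = []
    while undominated:
        # Pick the node whose closed neighbourhood covers the most
        # still-undominated nodes.
        best = max(adj, key=lambda v: len(({v} | adj[v]) & undominated))
        chosen.append(best)
        undominated -= {best} | adj[best]
    return chosen

# Toy path graph 0-1-2-3-4; nodes 1 and 3 together dominate every node,
# which is how few sensors would suffice to observe this network.
adj = {0: {1}, 1: {0, 2}, 2: {1, 3}, 3: {2, 4}, 4: {3}}
ds = greedy_dominating_set(adj)
```

As the abstract notes, detection alone reduces to this problem; locating an intruder additionally requires that observed neighbourhoods distinguish nodes, which is what identifying and locating–dominating codes enforce.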
Abstract:
This study describes how students influence their possibilities of participating in whole-class conversation. The main objective is to investigate the verbal and non-verbal resources used by students to modify the participant roles of the ongoing conversation. The resources studied are attention-getting devices such as hand-raising and address terms, recycling and other forms of collaborative talk, means of reference to persons, such as pronouns, as well as gaze and other embodied resources. The theoretical and methodological framework adopted in this study is that of conversation analysis. The data consist of ten videotaped lessons of Finnish as a second language in three secondary schools (grades 7–9) in southern Finland; the number of students per group varies from five to ten. Finnish has a triple role in the data as the medium of teaching, the target language, and the lingua franca used by the participants. The findings show that the multi-party context of the classroom conversation is both a disadvantage and an affordance for student participation. The students possess multiple tools to overcome and deal with the encumbrances posed by the large number of participants. They combine various techniques in order to actively compete for public turns, and they monitor the ongoing conversation carefully to adjust their participation to the activities of other participants. Sometimes the whole-class conversation splits into two separate conversations, but participants usually orient to the overlapping nature of the talk and tend to bring the conversations together rapidly. On the other hand, students skilfully make use of other participants and their talk to increase and diversify their own possibilities to participate. For example, they recycle elements of each other's turns or refer to the currently speaking student in order to gain access to the conversation. Students interact with each other even during the public whole-class conversation.
Students orient to one another often even when talking to the teacher, but they also address talk directly to one another, as part of the public conversation. In this way students increase each other's possibilities of participation. The interaction is constantly multi-layered: in addition to the pedagogic agenda, the students orient to social goals, for example, by teasing each other and putting on humorous performances for their peer audience. The student–student participation arises spontaneously from a genuine need to communicate and thus represents authentic language use: by talking to each other, often playfully, the students appropriate Finnish vocabulary, grammar, and expressions. In this way the structure of the interaction reflects the particular nature of Finnish as a second language lessons: all talk serves the pedagogic goal of enabling students to communicate in the target language.
Abstract:
We present noise measurements of a phase fluorometric oxygen sensor that set the limits of accuracy for this instrument. We analyze the phase sensitive detection measurement system with the signal 'shot' noise being the only significant contribution to the system noise. Based on the modulated optical power received by the photomultiplier, the analysis predicts a noise power spectral density that was within 3 dB of the measured noise power spectral density. Our results demonstrate that at a received optical power of 20 fW the noise level was low enough to permit the detection of a change in oxygen concentration of 1% at the sensor. We also present noise measurements of a new low-cost version of this instrument that uses a photodiode instead of a photomultiplier. These measurements show that the noise for this instrument was limited by noise generated in the preamplifier following the photodiode. (C) 1996 Society of Photo-Optical Instrumentation Engineers.
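The shot-noise limit invoked above follows from the standard one-sided current spectral density S_i = 2qI, giving an RMS noise current i_n = sqrt(2qIB) in bandwidth B. The back-of-envelope sketch below is ours, not the paper's analysis: the responsivity and bandwidth values are illustrative assumptions, with only the 20 fW figure taken from the abstract.

```python
import math

# Shot-noise-limited SNR for a weak optical signal: i_n = sqrt(2 q I B).
q = 1.602e-19   # electron charge [C]
P = 20e-15      # received optical power [W] (20 fW, as in the abstract)
R = 0.05        # ASSUMED photocathode responsivity [A/W], illustrative only
B = 1.0         # ASSUMED measurement bandwidth [Hz], illustrative only

I = R * P                            # mean photocurrent [A]
i_noise = math.sqrt(2 * q * I * B)   # RMS shot-noise current [A]
snr = I / i_noise                    # shot-noise-limited signal-to-noise ratio
```

Even at femtowatt signal levels the shot-noise-limited SNR can exceed unity in a narrow detection bandwidth, which is why phase sensitive detection of such weak modulated signals is feasible.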
Abstract:
Electronic exchanges are double-sided marketplaces that allow multiple buyers to trade with multiple sellers, with aggregation of demand and supply across the bids to maximize the revenue in the market. In this paper, we propose a new design approach for a one-shot exchange that collects bids from buyers and sellers and clears the market at the end of the bidding period. The main principle of the approach is to decouple allocation from pricing. It is well known that it is impossible for an exchange with voluntary participation to be both efficient and budget-balanced. Budget-balance is a mandatory requirement for an exchange to operate at a profit. Our approach is to allocate the trade so as to maximize the reported values of the agents. Pricing is then posed as a payoff determination problem that distributes the total payoff fairly among all agents, with budget-balance imposed as a constraint. We devise an arbitration scheme, derived by an axiomatic approach, to solve the payoff determination problem using the added-value concept of game theory.
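The allocation half of the decoupling principle can be sketched for the simplest case. This is our illustration, not the paper's mechanism: with single-unit bids, the allocation that maximizes total reported value matches the highest buyer bids with the lowest seller asks for as long as each matched bid covers the ask; the bid values below are hypothetical, and pricing would be handled separately, as the paper proposes.

```python
# Value-maximizing allocation for a one-shot, single-unit exchange.
def allocate(buyer_bids, seller_asks):
    """Match highest bids with lowest asks while each trade has non-negative
    surplus; returns the list of (bid, ask) pairs that trade."""
    buyers = sorted(buyer_bids, reverse=True)
    sellers = sorted(seller_asks)
    trades = []
    for b, s in zip(buyers, sellers):
        if b >= s:
            trades.append((b, s))
        else:
            break        # any further pair would reduce total reported value
    return trades

bids = [10, 8, 6, 3]
asks = [2, 5, 7, 9]
trades = allocate(bids, asks)
surplus = sum(b - s for b, s in trades)   # total payoff to be divided fairly
```

The total surplus produced by this allocation is exactly the quantity the paper's payoff determination problem then divides among the agents under the budget-balance constraint.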
Abstract:
A detailed study of surface laser damage performed on a nonlinear optical crystal, urea L-malic acid, using 7 ns laser pulses at a 10 Hz repetition rate from a Q-switched Nd:YAG laser at wavelengths of 532 and 1064 nm is reported. The single-shot and multiple-shot surface laser damage threshold values are determined to be 26.64±0.19 and 20.60±0.36 GW cm−2 at 1064 nm, and 18.44±0.31 and 7.52±0.22 GW cm−2 at 532 nm, respectively. The laser damage anisotropy is consistent with Vickers mechanical hardness measurements performed along three crystallographic directions. The Knoop polar plot also reflects the damage morphology. Our investigation reveals a direct correlation between the laser damage profile and hardness anisotropy. Thermal breakdown of the crystal is identified as the possible mechanism of laser-induced surface damage.
Abstract:
In this paper, we have studied the effect of gate-drain/source overlap (LOV) on the drain channel noise and the induced gate current noise (SIg) in 90 nm N-channel metal oxide semiconductor field effect transistors using process and device simulations. As the change in overlap affects the gate tunneling leakage current, its effect on the shot noise component of SIg has been taken into consideration. It is shown that control over LOV allows us to obtain better noise performance from the device, i.e., to reduce the noise figure for a given leakage current constraint. An LOV in the range of 0–10 nm is recommended for 90 nm gate length transistors in order to obtain the best performance in radio frequency applications.
Abstract:
Electronic exchanges are double-sided marketplaces that allow multiple buyers to trade with multiple sellers, with aggregation of demand and supply across the bids to maximize the revenue in the market. Two important issues in the design of exchanges are (1) trade determination (determining the number of goods traded between any buyer-seller pair) and (2) pricing. In this paper we address the trade determination issue for one-shot, multi-attribute exchanges that trade multiple units of the same good. The bids are configurable, with separable additive price functions over the attributes; each function is continuous and piecewise linear. We model trade determination as mixed integer programming problems for different possible bid structures and show that even in two-attribute exchanges, trade determination is NP-hard for certain bid structures. We also make some observations on the pricing issues that are closely related to the mixed integer formulations.
Abstract:
Third World hinterlands provide most of the settings in which the quality of human life has improved the least over the decade since Our Common Future was published. This low quality of life promotes a desire for large numbers of offspring, fuelling population growth and an exodus to the urban centres of the Third World. Enhancing the quality of life of these people in ways compatible with the health of their environments is therefore the most significant of the challenges from the perspective of sustainable development. Human quality of life may be viewed in terms of access to goods, services and a satisfying social role. The ongoing processes of globalization are enhancing flows of goods worldwide, but these hardly reach the poor of Third World countrysides. However, the processes of globalization have also vastly improved everybody's access to information, and there are excellent opportunities for putting this to good use to enhance the quality of life of the people of Third World countrysides through better access to education and health. More importantly, better access to information could promote a more satisfying social role by strengthening grass-roots involvement in development planning and the management of natural resources. I illustrate these possibilities with the help of a series of concrete experiences from the south Indian state of Kerala. Such an effort does not call for large-scale material inputs; rather, it calls for a culture of inform-and-share in place of the prevalent culture of control-and-command. It calls for openness and transparency in transactions involving government agencies, NGOs, and national and transnational business enterprises. It calls for the acceptance of accountability by such agencies.
Abstract:
The use of high-velocity sheet-forming techniques, where the strain rates are in excess of 10^2/s, can help solve many problems that are difficult to overcome with traditional metal-forming techniques. In this investigation, thin metallic plates/foils were subjected to shock wave loading in a newly developed diaphragmless shock tube. The conventional shock tube used in aerodynamic applications uses a metal diaphragm for generating shock waves. This method of operation has its own disadvantages, including problems with the repeatable and reliable generation of shock waves. Moreover, in an industrial scenario, changing metal diaphragms after every shot is not desirable. Hence, a diaphragmless shock tube is calibrated and used in this study. Shock Mach numbers up to 3 can be generated with a high degree of repeatability (+/- 4 per cent) for the pressure jumps across the primary shock wave. The shock Mach number scatter is within +/- 1.5 per cent. Copper, brass, and aluminium plates of diameter 60 mm and thickness varying from 0.1 to 1 mm are used, with plate peak over-pressures ranging from 1 to 10 bar. The midpoint deflection and the circumferential, radial, and thickness strains are measured, and from these the von Mises strain is also calculated. The experimental results are compared with numerical values obtained using finite element analysis, and the two match well. The plastic hinge effect was also observed in the finite element simulations. Analysis of the failed specimens shows that the aluminium plates had mode I failure, whereas the copper plates had mode II failure.
Abstract:
The advent and evolution of geohazard warning systems is a very interesting study. The two broad fields that are immediately visible are geohazard evaluation and subsequent warning dissemination. Evidently, the latter field lacks any systematic study or standards. Arbitrarily organized and vague data and information on warning techniques create confusion and indecision. The purpose of this review is to try to systematize the available bulk of information on warning systems so that meaningful insights can be derived through decidable flowcharts and a developmental process can be undertaken. Hence, the methods and technologies of numerous geohazard warning systems have been assessed by putting them into suitable categories for a better understanding of possible ways to analyze their efficacy as well as their shortcomings. By establishing a classification scheme based on extent, control, time period, and advancements in technology, the geohazard warning systems available in the literature could be comprehensively analyzed and evaluated. Although major advancements have taken place in geohazard warning systems in recent times, they have lacked a complete purpose. Some systems just assess the hazard and wait for other means to communicate it, and some are designed only for communication and wait for the hazard information to be provided, which usually happens after the mishap. Primarily, systems are left at the mercy of administrators and service providers and are not in real time. An integrated hazard evaluation and warning dissemination system could solve this problem. Warning systems have also suffered from the complexity of nature, the requirement of expert-level monitoring, extensive and dedicated infrastructural setups, and so on. The user community, which would greatly appreciate a convenient, fast, and generalized warning methodology, is surveyed in this review.
The review concludes with the future scope of research in the field of hazard warning systems and some suggestions for developing an efficient mechanism toward the development of an automated integrated geohazard warning system. DOI: 10.1061/(ASCE)NH.1527-6996.0000078. (C) 2012 American Society of Civil Engineers.