989 results for hard-kill methods
Abstract:
Energy usage in general, and electricity usage in particular, are major concerns internationally due to the increasing cost of providing energy supplies and the environmental impacts of generating electricity from carbon-based fuels. If a "systems" approach is taken to understanding energy issues, then both supply and demand need to be considered holistically. This paper examines two research projects in the energy area with IT tools as key deliverables, one examining supply issues and the other studying demand-side issues. The supply-side project used hard engineering methods to build its models and software, while the demand-side project used a social science approach. Although the projects are distinct, there was an overlap in personnel. Comparing the knowledge extraction, model building, implementation, and interface issues of the two deliverables identifies both interesting contrasts and commonalities.
Abstract:
The most difficult operation in flood inundation mapping using optical flood images is separating fully inundated areas from 'wet' areas where trees and houses are partly covered by water. This can be regarded as a typical instance of the mixed-pixel problem. A number of automatic information-extraction image classification algorithms have been developed over the years for flood mapping with optical remote sensing images. Most classification algorithms assign each pixel to the single class label with the greatest likelihood. However, these hard classification methods often fail to generate reliable flood inundation maps because of the presence of mixed pixels in the images. To solve the mixed-pixel problem, advanced image processing techniques are adopted; linear spectral unmixing is one of the most popular soft classification techniques used for mixed-pixel analysis. The performance of linear spectral unmixing depends on two important issues: the method of selecting endmembers and the method of modelling the endmembers for unmixing. This paper presents an improved adaptive selection of an endmember subset for each pixel in spectral unmixing for reliable flood mapping. Using a fixed set of endmembers to unmix all pixels in an entire image may overestimate the endmember spectra residing in a mixed pixel and hence reduce the performance of spectral unmixing. In contrast, applying an adaptively estimated subset of endmembers for each pixel can decrease the residual error in the unmixing results and provide reliable output. The paper also shows that the proposed method improves the accuracy of conventional linear unmixing methods and is easy to apply. Three different linear spectral unmixing methods were applied to test the improvement in unmixing results. Experiments were conducted on three sets of Landsat-5 TM images of three different flood events in Australia to examine the method under different flooding conditions, and satisfactory flood-mapping outcomes were achieved.
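As an illustration of the linear mixing model this abstract builds on, the sketch below unmixes a single pixel by non-negative least squares and approximates the per-pixel adaptive endmember selection with a simple lowest-residual subset search. The endmember spectra, band count, and selection rule are hypothetical placeholders, not the paper's actual data or algorithm.

```python
# A minimal sketch of linear spectral unmixing for a single pixel,
# assuming a hypothetical endmember matrix E (bands x endmembers).
# The paper's adaptive per-pixel endmember selection is approximated
# here by trying every endmember subset and keeping the lowest residual.
from itertools import combinations

import numpy as np
from scipy.optimize import nnls  # non-negative least squares

def unmix(pixel, E):
    """Constrained unmixing: non-negative abundances, normalised
    afterwards to sum to one (a common approximation)."""
    abundances, residual = nnls(E, pixel)
    total = abundances.sum()
    if total > 0:
        abundances = abundances / total
    return abundances, residual

def adaptive_unmix(pixel, E, min_size=2):
    """Pick the endmember subset with the smallest unmixing residual,
    mimicking the per-pixel adaptive selection described in the abstract."""
    n = E.shape[1]
    best = None
    for size in range(min_size, n + 1):
        for subset in combinations(range(n), size):
            a, r = unmix(pixel, E[:, list(subset)])
            if best is None or r < best[2]:
                best = (subset, a, r)
    return best

# Hypothetical 6-band Landsat-like reflectance spectra: water, vegetation, soil.
E = np.array([[0.06, 0.04, 0.03, 0.02, 0.01, 0.01],    # water
              [0.03, 0.05, 0.04, 0.40, 0.22, 0.10],    # vegetation
              [0.12, 0.15, 0.18, 0.25, 0.30, 0.28]]).T  # soil
pixel = 0.6 * E[:, 0] + 0.4 * E[:, 1]  # a mixed water/vegetation pixel
subset, abundances, residual = adaptive_unmix(pixel, E)
print(subset, abundances.round(2), residual)
```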
Abstract:
The widespread incidence of enterococci resistant to ampicillin, vancomycin and aminoglycosides, the first-line anti-enterococcal antibiotics, has made the treatment of severe enterococcal infections difficult, and alternatives should be explored. We investigated the activity of daptomycin combined with linezolid against three Enterococcus faecalis and four Enterococcus faecium strains resistant to the standard drugs used for therapy. Minimum inhibitory concentrations (MICs) were determined by the broth dilution method. Drug interactions were assessed by the checkerboard and time-kill methods. Synergy was defined as a fractional inhibitory concentration index (FICI) of ≤0.5 or a ≥2 log10 CFU/mL killing at 24 h with the combination in comparison with killing by the most active single agent. Indifference was defined as a FICI of >0.5 to 4.0 or a 1-2 log10 CFU/mL killing compared with the most active single agent. MICs of daptomycin were 2-4 μg/mL for E. faecalis and 2-8 μg/mL for E. faecium. MICs of linezolid were 1-2 μg/mL for all bacteria. In the checkerboard assay, five isolates showed synergism (FICI < 0.5) and two showed indifference (FICIs of 0.53 and 2). Killing studies revealed synergy of daptomycin plus linezolid against four isolates (2.2-3.7 log10 CFU/mL kill) and indifference (1.1-1.6 log10 CFU/mL kill) for the other three strains. Antagonism was not observed. In conclusion, the combination of daptomycin and linezolid had a synergistic or indifferent effect against multidrug-resistant enterococci. Additional studies are needed to explore the potential of this combination for severe enterococcal infections when first-line antibiotic combinations cannot be used.
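For readers unfamiliar with the checkerboard metric, the short sketch below computes the fractional inhibitory concentration index and applies the cutoffs defined in the abstract; the MIC values are hypothetical, not taken from the study.

```python
# A small sketch of the checkerboard FICI computation referenced above.
# Cutoffs follow the abstract's definitions (synergy: FICI <= 0.5;
# indifference: >0.5 to 4.0); antagonism is commonly taken as FICI > 4.0.

def fici(mic_a_alone, mic_b_alone, mic_a_combo, mic_b_combo):
    """Fractional inhibitory concentration index for a two-drug checkerboard."""
    return mic_a_combo / mic_a_alone + mic_b_combo / mic_b_alone

def interpret(index):
    if index <= 0.5:
        return "synergy"
    if index <= 4.0:
        return "indifference"
    return "antagonism"

# Hypothetical isolate: daptomycin MIC 4 ug/mL and linezolid MIC 2 ug/mL alone;
# in combination, both inhibit at one quarter of their single-agent MICs.
index = fici(4.0, 2.0, 1.0, 0.5)
print(index, interpret(index))  # 0.5 synergy
```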
Abstract:
Armour and anti-armour weapons have been locked in a race since the First World War. Tank protection has been improved by increasing armour thickness and by fitting add-on armour. The most recent trend has been the development of active protection systems, which aim to stop an anti-tank weapon from affecting the tank before it can reach its target. This thesis examines how hard-kill active protection systems, which engage an incoming anti-tank weapon with a counter-munition, can be defeated using the disposable anti-tank weapons on which every infantry soldier is trained. The thesis determines the firing ranges from which one must engage so that the active protection system has no time to react to the incoming threat. The thesis is a qualitative literature review carried out as a hermeneutic analysis. A statistical method was used to compare the interaction between the weapons and the active protection systems. The central conclusion is that some of the systems can be engaged in such a way that they cannot react to the anti-tank weapon in time. For systems that are capable of countering the threat, alternative methods of engagement are discussed.
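The firing-range question the thesis studies reduces to time-of-flight arithmetic: the weapon defeats the system whenever its flight time is shorter than the system's reaction cycle. The sketch below illustrates this with hypothetical numbers; the projectile speed and reaction time are placeholders, not values from the thesis.

```python
# A back-of-the-envelope sketch of the firing-distance question: if the
# rocket's time of flight is shorter than the active protection system's
# reaction time, the system cannot respond before impact.
# All velocities and reaction times below are hypothetical placeholders.

def max_defeat_distance(projectile_speed_mps, aps_reaction_time_s):
    """Longest firing distance at which the APS still cannot react in time."""
    return projectile_speed_mps * aps_reaction_time_s

# Hypothetical disposable anti-tank rocket at 250 m/s against an APS
# with an assumed 0.35 s detect-to-intercept cycle.
print(max_defeat_distance(250.0, 0.35))  # 87.5 m
```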
Abstract:
Optimization techniques known as metaheuristics have achieved success in solving many problems classified as NP-hard. These methods use non-deterministic approaches that reach very good solutions but do not guarantee finding the global optimum. Beyond the inherent difficulties related to the complexity that characterizes optimization problems, metaheuristics still face the exploration/exploitation dilemma, which consists of choosing between a greedy search and a wider exploration of the solution space. One way to guide such algorithms in the search for better solutions is to supply them with more knowledge of the problem through an intelligent agent able to recognize promising regions and to identify when the search direction should be diversified. This work therefore proposes a reinforcement learning technique, the Q-learning algorithm, as the exploration/exploitation strategy for the GRASP (Greedy Randomized Adaptive Search Procedure) metaheuristic and for a Genetic Algorithm. The GRASP metaheuristic uses Q-learning instead of the traditional greedy-random algorithm in the construction phase. This replacement aims to improve the quality of the initial solutions used in the local search phase of GRASP, and it also provides the metaheuristic with an adaptive memory mechanism that allows good previous decisions to be reused and bad ones to be avoided. In the Genetic Algorithm, Q-learning was used to generate an initial population of high fitness and, after a given number of generations in which the diversity rate of the population falls below a certain limit L, to supply one of the parents used in the genetic crossover operator. Another significant change in the hybrid genetic algorithm is a mutually interactive cooperation process between the genetic operators and the Q-learning algorithm: Q-learning receives an additional update to its matrix of Q-values based on the current best solution of the Genetic Algorithm. The computational experiments presented in this thesis compare the results obtained with traditional versions of the GRASP metaheuristic and the Genetic Algorithm against those obtained with the proposed hybrid methods. Both algorithms were applied successfully to the symmetric Traveling Salesman Problem, which was modeled as a Markov decision process.
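As a minimal sketch of the exploration/exploitation mechanism described above, the code below implements the standard one-step Q-learning update with an epsilon-greedy action choice. The state encoding, reward, and parameter values are assumptions for illustration; the thesis applies this machinery to GRASP construction and genetic crossover on the symmetric TSP.

```python
# A minimal sketch of the Q-learning update and epsilon-greedy choice used
# to balance exploration (random moves) against exploitation (greedy moves).
# State/action encoding, rewards, and parameters are hypothetical.
import random
from collections import defaultdict

ALPHA, GAMMA, EPSILON = 0.1, 0.9, 0.2  # assumed learning parameters

Q = defaultdict(float)  # Q[(state, action)] -> estimated value

def choose(state, actions):
    """Epsilon-greedy: explore with probability EPSILON, else be greedy."""
    if random.random() < EPSILON:
        return random.choice(actions)
    return max(actions, key=lambda a: Q[(state, a)])

def update(state, action, reward, next_state, next_actions):
    """Standard one-step Q-learning update of the Q-value matrix."""
    best_next = max((Q[(next_state, a)] for a in next_actions), default=0.0)
    Q[(state, action)] += ALPHA * (reward + GAMMA * best_next - Q[(state, action)])

# One illustrative transition on a toy state space.
a = choose("s0", ["s1", "s2"])
update("s0", a, reward=1.0, next_state=a, next_actions=["s1", "s2"])
```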
Abstract:
Topological optimization problems based on stress criteria are solved using two techniques in this paper. The first is the conventional Evolutionary Structural Optimization (ESO), known as hard kill because material is removed discretely: elements under low stress that are being utilized inefficiently have their constitutive matrix abruptly reduced. The second technique, proposed in a previous paper, is a variant of the ESO procedure called Smooth ESO (SESO), which is based on the philosophy that if an element is not really necessary to the structure, its contribution to the structural stiffness should gradually diminish until it no longer influences the structure; its removal is thus performed smoothly. This procedure is known as "soft kill"; that is, not all of the elements that the ESO criterion would remove are discarded. The elements returned to the structure must yield a well-conditioned system to be solved in the next iteration, and they are considered important to the optimization process. To evaluate elasticity problems numerically, finite element analysis is applied; instead of conventional quadrilateral finite elements, a plane-stress triangular finite element with high-order modes was implemented for solving complex geometric problems. A number of typical examples demonstrate that the proposed approach is effective for solving problems of two-dimensional elasticity.
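The hard-kill/soft-kill distinction can be made concrete with a small sketch: hard kill zeroes an under-stressed element's stiffness in one step, while soft kill ramps it down smoothly. The stress values, rejection ratio, and ramp below are hypothetical, and a real ESO/SESO loop would recompute stresses by finite element analysis at each iteration.

```python
# A schematic contrast between the hard-kill (ESO) and soft-kill (SESO)
# stiffness updates described above. Factors scale each element's
# constitutive matrix: 1.0 keeps the element, ~0 effectively removes it.
import numpy as np

def eso_factors(stress, max_stress, rr):
    """Hard kill: elements below the rejection threshold are removed
    outright (their stiffness is scaled to ~0 in one step)."""
    return np.where(stress < rr * max_stress, 1e-6, 1.0)

def seso_factors(stress, max_stress, rr, ramp=0.5):
    """Soft kill: elements just below the threshold are weakened gradually,
    so marginal elements can still contribute to (or rejoin) the structure."""
    low, high = (rr - ramp * rr) * max_stress, rr * max_stress
    t = np.clip((stress - low) / (high - low), 0.0, 1.0)
    return np.maximum(t, 1e-6)

stress = np.array([5.0, 40.0, 80.0, 120.0])  # hypothetical element stresses
print(eso_factors(stress, 120.0, 0.5))   # [1e-06 1e-06 1. 1.] : step removal
print(seso_factors(stress, 120.0, 0.5))  # gradual ramp instead of a step
```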
Abstract:
The relationship between employer and worker is not only obligatory but, above all, as Sinzheimer said, a 'relationship of power'. In the Digital Age this statement is confirmed by the massive introduction of ICT in most companies, which in practice increases the employer's supervisory powers. This is a worrying issue for two reasons: on the one hand, ICT emerges as a new way to weaken the effectiveness of fundamental rights and the right to dignity of workers; on the other hand, the Spanish legal system does not offer appropriate solutions to ensure that efficacy. Moreover, in a scenario characterized by a hybridization of legal-system models, in which traditional hard law methods are combined with soft law and self-regulation instruments, the role of our case law has become very important in this matter. Nevertheless, despite increasing judicialization, the solutions offered by the courts are so divergent that they do not provide sufficient legal certainty. Facing this situation, I suggest a methodological approach, drawing on Alchourron and Bulygin's theory of normative systems and Alexy's theory of fundamental rights, which can open new spaces of decision to legal operators in order to solve these problems properly. This proposal would allow setting a policy that guarantees the fundamental rights of workers, deepening their human freedom within companies from Esping-Andersen's de-commodification perspective. To this end, I examine electronic communications in the company as a case study.
Abstract:
Thesis (Master's)--University of Washington, 2016-06
Abstract:
The problem of rats in our Hawaiian sugar cane fields has been with us for a long time. Early records tell of heavy damage at various times on all the islands where sugar cane is grown. Many methods were tried to control these rats: trapping was once used as a control measure, a bounty was paid for a time, and gangs of dogs were trained to catch the rats as the cane was harvested. Many kinds of baits and poisons were used. All of these methods were of some value as long as labor was cheap. Our present-day problem started when labor costs started up and the sugar industry shifted to long cropping. Until World War II cane was an annual crop; after the war it was shifted to a two-year crop, three years in some places. Depending on variety, location, and soil we raise 90 to 130 tons of sugar cane per acre, which produces 7 to 15 tons of sugar per acre for a two-year crop. This sugar brings about $135 per ton.

This tonnage of cane is a thick tangle of vegetation. The cane grows erect for almost a year; as it continues to grow it bends over at the base, which allows the stalk to rest on the ground or on other stalks of cane while it keeps growing. These stalks form a tangled mat of stalks and dead leaves that may be two feet thick at the time of harvest, while the leafy growing portion of the stalk sticks up out of the mat ten feet in the air. Some individual stalks may be 30 feet long and still growing at the time of harvest. All this makes it very hard to get through a cane field; it is one long, prolonged stumble over and through the cane.

It is in this mat of cane that our three species of rats live. Two species are familiar to most people in the pest control field: Rattus norvegicus and Rattus rattus. In the latter species we include both the black rat and the Alexandrine rat; their habits seem to be the same in Hawaii. Our third rat is the Polynesian rat, Rattus exulans, locally called the Hawaiian rat. This is a small rat: the average length from head to tip of tail is nine inches and the average body weight is 65 grams. It has dark brownish fur like the Alexandrine rat, and a grey belly. It is found in Indonesia, on most of the islands of Oceania and in New Zealand. All three rats live in our cane fields and in the brushy and forested portions of our islands. The Norway and Alexandrine rats are found in and around the villages and farms; the Polynesian rat is found only in the fields and waste areas.

The actual amount of cane eaten by rats is small, but the destruction they cause is large. The rats gnaw through the rind of the cane stalk and eat the soft, juicy, sweet tissues inside. They will hollow out one to several nodes per stalk attacked. The effect on the cane stalk is like ring-barking a tree: after an attack the stalk above the chewed portion usually dies, and sometimes the lower portion too. If the rat does not eat through the stalk, the cane stalk could go on living and producing sugar at a reduced rate, but generally an injured stalk does not last long. Disease and souring organisms get into the injury and kill the stalk. And if this isn't enough, some insects are attracted to the injured stalk and will sometimes bore in and kill it. An injured stalk of cane doesn't have much of a chance. A rat may gnaw out only six inches of a 30-foot stalk and the whole stalk will die. If the rats destroyed only what they ate we could ignore them, but they cause the death of too much cane. This dead, dying, and souring cane causes several direct and indirect losses. First, we lose the sugar that the cane would have produced. Because we harvest all of our cane mechanically, we haul the dead and souring cane to the mill, where we have to grind it with our good cane, and the bad cane reduces the purity of the sugar juices we squeeze from the cane. Rats reduce our income and run up our overhead.
Abstract:
The aim of this study was to compare speech in subjects with cleft lip and palate in whom three methods of hard palate closure were used. One hundred and thirty-seven children (96 boys, 41 girls; mean age = 12 years, SD = 1·2) with complete unilateral cleft lip and palate (CUCLP) operated on by a single surgeon with a one-stage method were evaluated. The management of the cleft lip and soft palate was comparable in all subjects; for hard palate repair, three different methods were used: bilateral von Langenbeck closure (b-vL group, n = 39), unilateral von Langenbeck closure (u-vL group, n = 56) and vomerplasty (v-p group, n = 42). Speech was assessed (i) perceptually for the presence of (a) hypernasality, (b) compensatory articulations (CAs), (c) audible nasal air emissions (ANE) and (d) speech intelligibility; (ii) for the presence of compensatory facial grimacing; (iii) with clinical intra-oral evaluation; and (iv) with videonasendoscopy. The total rate of hypernasality requiring pharyngoplasty was 5·1%, and the total incidence of post-oral CAs was 2·2%. Overall speech intelligibility was good in 84·7% of cases. Oronasal fistulas (ONFs) occurred in 15·7% of b-vL subjects, 7·1% of u-vL subjects and 50% of v-p subjects (P < 0·001). No statistically significant intergroup differences in hypernasality, CAs or intelligibility were found (P > 0·1). In conclusion, speech after early one-stage repair of CUCLP was satisfactory. The method of hard palate repair affected the incidence of ONFs, which, however, caused relatively mild and inconsistent speech errors.
Abstract:
Now, as in earlier periods of acute change in the media environment, new disciplinary articulations are producing new methods for media and communication research. At the same time, established media and communication studies methods are being recombined, reconfigured, and remediated alongside their objects of study. This special issue of JOBEM seeks to explore the conceptual, political, and practical aspects of emerging methods for digital media research. It does so at the conjuncture of a number of important contemporary trends: the rise of a "third wave" of the Digital Humanities and the "computational turn" (Berry, 2011) associated with natively digital objects and the methods for studying them; the apparently ubiquitous Big Data paradigm, with its various manifestations across academia, business, and government, that brings with it a rapidly increasing interest in social media communication and online "behavior" from the "hard" sciences; along with the multisited, embodied, and emplaced nature of everyday digital media practice.
Abstract:
Schweitzer et al. previously published a paper in the Australian and New Zealand Journal of Psychiatry which provided prevalence rates of suicidal ideation and behaviour among university students [1]. We wish to provide an update on extensions of our previously published work. In our previous publication we noted the relatively high percentage of students who reported suicide-related behaviour over the past 12 months (6.6%). This figure is very similar to a more recent study undertaken in the UK, where 6% of student respondents reported suicide attempts [2]. As a follow-up, we investigated this finding further in studies undertaken in 1994 and 1997 by asking fresh samples of University of Queensland first-year undergraduates who responded positively to the statement 'I have made attempts to kill myself' (in the past year) to provide additional data on the methods employed in their suicide attempts and the consequences of their attempts in terms of level of injury and medical care received...
Abstract:
An innovative cement-based soft-hard-soft (SHS) multi-layer composite has been developed for protective infrastructure. The composite consists of three layers: asphalt concrete (AC), high-strength concrete (HSC) and engineered cementitious composites (ECC). A three-dimensional benchmark numerical model of this SHS composite as pavement under blast load was established using LS-DYNA and validated against a field blast test. Parametric studies were carried out to investigate the influence of a few key parameters, including the thickness and strength of the HSC and ECC layers, interface properties and soil conditions, on the blast resistance of the composite. The outcomes of this study also enabled the establishment of a damage-pattern chart for protective pavement design and for rapid repair after blast loading. Efficient methods to further improve the blast resistance of the SHS multi-layer pavement system are also recommended.