899 results for the SIMPLE algorithm
Abstract:
An option is a financial contract that gives its holder the right (but not the obligation) to sell something (for example a share) to, or buy it from, the seller of the option at a fixed price at a specified future time. The seller of the option commits to this future transaction should the option holder later decide to exercise the option. The seller thus takes on the risk that the future transaction the option holder can force upon him turns out to be unfavourable for him. The question of how the seller can protect himself against this risk leads to interesting optimization problems, in which the goal is to find an optimal hedging strategy under given conditions. Such optimization problems have been studied extensively in financial mathematics. The thesis "The knapsack problem approach in solving partial hedging problems of options" adds a further point of view to this discussion: in a relatively simple (finite and complete) market model, certain partial hedging problems can be formulated as so-called knapsack problems. The latter are well known within the branch of mathematics called operations research. The thesis shows how hedging problems previously solved by other means can alternatively be solved with methods developed for knapsack problems. The approach is also applied to entirely new hedging problems connected with so-called American options.
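The thesis's hedging formulation is not reproduced here, but the 0/1 knapsack problem it maps hedging onto can be illustrated with the standard textbook dynamic program (a generic sketch with invented item values and weights, not the thesis's construction):

```python
def knapsack(values, weights, capacity):
    """0/1 knapsack: maximize total value subject to a weight budget.

    dp[c] holds the best value achievable with capacity c using the
    items processed so far (classic one-dimensional DP).
    """
    dp = [0] * (capacity + 1)
    for v, w in zip(values, weights):
        # iterate capacities downwards so each item is used at most once
        for c in range(capacity, w - 1, -1):
            dp[c] = max(dp[c], dp[c - w] + v)
    return dp[capacity]

# toy instance (made-up numbers)
print(knapsack([60, 100, 120], [10, 20, 30], 50))  # → 220
```

The same weight-budget structure is what allows a hedging constraint (e.g. a bounded hedging cost) to play the role of the knapsack capacity.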
Abstract:
In less than twenty years, what began as a concept for the treatment of exsanguinating truncal trauma patients has become the primary treatment model for numerous emergent, life-threatening surgical conditions in patients incapable of tolerating traditional methods. Its core concepts are relatively straightforward and simple in nature: first, proper identification of the patient in need of this paradigm; second, truncation of the initial surgical procedure to the minimal necessary operation; third, aggressive, focused resuscitation in the intensive care unit; fourth, definitive care only once the patient is optimized to tolerate the procedure. These simple underlying principles can be molded to a variety of emergencies, from the original application in combined major vascular and visceral trauma to the septic abdomen and orthopedics. A host of new resuscitation strategies and technologies has been developed over the past two decades, from permissive hypotension and damage control resuscitation to advanced ventilators and hemostatic agents, permitting a more focused resuscitation and reducing some of the morbidity of this model. The combination of this simple, malleable paradigm with a better understanding of resuscitation has proven to be a potent blend. As such, what was once an almost lethal injury (combined vascular and visceral injury) has become a survivable one.
Abstract:
By coupling the Boundary Element Method (BEM) and the Finite Element Method (FEM), an algorithm is developed that combines the advantages of both numerical processes. The main aim of the work is the time-domain analysis of general three-dimensional wave propagation problems in elastic media. In addition, mathematical and numerical aspects of the related BE-, FE- and BE/FE-formulations are discussed. The coupling algorithm allows the investigation of elastodynamic problems with a BE- and an FE-subdomain. In order to assess the performance of the coupling algorithm, two problems are solved and their results compared to other numerical solutions.
Abstract:
The amount of biological data has grown exponentially in recent decades. Modern biotechnologies, such as microarrays and next-generation sequencing, are capable of producing massive amounts of biomedical data in a single experiment. As the amount of data is rapidly growing, there is an urgent need for reliable computational methods for analyzing and visualizing it. This thesis addresses this need by studying how to efficiently and reliably analyze and visualize high-dimensional data, especially that obtained from gene expression microarray experiments. First, we study ways to improve the quality of microarray data by replacing (imputing) the missing data entries with estimated values. Missing value imputation is commonly used to make originally incomplete data complete, thus making it easier to analyze with statistical and computational methods. Our novel approach was to use curated external biological information as a guide for the missing value imputation. Secondly, we studied the effect of missing value imputation on downstream data analysis methods such as clustering. We compared multiple recent imputation algorithms on 8 publicly available microarray data sets. We observed that missing value imputation is indeed a rational way to improve the quality of biological data. The research revealed differences between the clustering results obtained with different imputation methods. On most data sets the simple and fast k-NN imputation was good enough, but there was also a need for more advanced imputation methods, such as Bayesian Principal Component Analysis (BPCA). Finally, we studied the visualization of biological network data. Biological interaction networks are examples of the outcome of multiple biological experiments, such as those using gene microarray techniques.
Such networks are typically very large and highly connected, so there is a need for fast algorithms that produce visually pleasing layouts. We developed a computationally efficient way to produce layouts of large biological interaction networks. The algorithm uses multilevel optimization within the regular force-directed graph layout algorithm.
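As an illustration of the k-NN imputation evaluated in the thesis, here is a minimal NumPy sketch (a generic textbook version, not the thesis's implementation or its biology-guided variant): for each row with a missing entry, find the k nearest rows by distance over the shared observed columns, and average their values at the missing positions.

```python
import numpy as np

def knn_impute(X, k=3):
    """Fill NaNs: for each incomplete row, average the corresponding
    entries of its k nearest rows (distance over shared observed columns)."""
    X = np.asarray(X, dtype=float)
    out = X.copy()
    for i in range(X.shape[0]):
        miss = np.isnan(X[i])
        if not miss.any():
            continue
        obs = ~miss
        dists = []
        for j in range(X.shape[0]):
            if j == i or np.isnan(X[j][miss]).any():
                continue  # a candidate must have the missing columns observed
            shared = obs & ~np.isnan(X[j])
            if not shared.any():
                continue
            d = np.sqrt(np.mean((X[i, shared] - X[j, shared]) ** 2))
            dists.append((d, j))
        dists.sort(key=lambda t: t[0])
        nbrs = [j for _, j in dists[:k]]
        if nbrs:
            out[i, miss] = np.mean(X[nbrs][:, miss], axis=0)
    return out

X = np.array([[1.0, 2.0, 3.0],
              [1.0, 2.0, 4.0],
              [10.0, 20.0, 30.0],
              [1.0, 2.0, np.nan]])
print(knn_impute(X, k=2))  # the NaN becomes 3.5, the mean of its two nearest rows
```

Real microarray implementations typically use weighted averages and gene-wise similarity, but the neighbour-averaging idea is the same.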
Abstract:
It is well known that saccadic reaction times (SRT) are reduced when the target is preceded by the offset of the fixation point (FP) - the gap effect. Some authors have proposed that the FP offset also allows the saccadic system to generate a separate population of SRT, the express saccades. Nevertheless, there is no agreement as to whether the gap effect and express responses are also present for manual reaction times (MRT). We tested the gap effect and the MRT distribution in two different conditions, i.e., simple and choice MRT. In the choice MRT condition, subjects need to identify the side of the stimulus and to select the appropriate response, while in the simple MRT these stages are not necessary. We report that the gap effect was present in both conditions (22 ms for choice MRT condition; 15 ms for simple MRT condition), but, when analyzing the MRT distributions, we did not find any clear evidence for express manual responses. The main difference in MRT distribution between simple and choice conditions was a shift towards shorter values for simple MRT.
Abstract:
The early facilitatory effect of a peripheral visual prime stimulus described in the literature for simple reaction time tasks has usually been smaller than that described for complex (go/no-go, choice) reaction time tasks. In the present study we investigated the reason for this difference. In a first and a second experiment we tested the participants in both a simple task and a go/no-go task, half of them beginning with one of these tasks and half with the other. We observed that the prime stimulus had an early effect, inhibitory for the simple task and facilitatory for the go/no-go task, when the task was performed first. No early effect appeared when the task was performed second. In a third and a fourth experiment the participants were tested, respectively, in the simple task and in the go/no-go task for four sessions (the prime stimulus was presented in the second, third and fourth sessions). The early effects of the prime stimulus did not change across the sessions, suggesting that a habituation process was not the cause of the disappearance of these effects in the first two experiments. Our findings are compatible with the idea that different attentional strategies are adopted in simple and complex reaction time tasks: in the former the gain of automatic attention mechanisms may be adjusted to a low level, and in the latter to a high level. The attentional influence of the prime stimulus may be antagonized by another influence, possibly a masking one.
Abstract:
This work responds to the need to manage the quality of a high-pressure water mist nozzle with the tools of fluid mechanics. In addition to nozzle test data, the behaviour of the flow inside the nozzle is investigated by means of CFD computation. The flow modelling is carried out with a Navier-Stokes-based computational method. The theoretical part of the work discusses flow engineering and its development in general, presents the basic theory and technical solutions used in the nozzle computations, and reviews the fundamentals of computational fluid dynamics (CFD). In the research part, the processed nozzle test results are presented and the nozzle flow is modelled with a steady-state computational method. The flow computations use the SIMPLE solver of the OpenFOAM software package together with the k-omega SST turbulence model. Flow modelling was carried out at all of the pressures actually used in nozzle testing. In addition, possible cavitation sites in the nozzle were identified and a cavitation-preventing nozzle geometry was designed. Temperature and impurities were also found to affect cavitation, and the effect of temperature was modelled. A model was created with which the challenges of nozzle design can be addressed by numerical computation.
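For context, the SIMPLE (Semi-Implicit Method for Pressure-Linked Equations) pressure-velocity coupling used by the OpenFOAM solver can be summarized in its textbook incompressible form (a generic schematic with standard notation, where $d_P$ denotes the cell-volume-to-momentum-coefficient ratio; this is not the specific discretization of this work):

```latex
\begin{enumerate}
  \item Guess a pressure field $p^{*}$ and solve the discretized momentum
        equations for a provisional velocity field $\mathbf{u}^{*}$:
        \[ a_P\,\mathbf{u}^{*}_P \;=\; \sum_{nb} a_{nb}\,\mathbf{u}^{*}_{nb}
           \;+\; \mathbf{b} \;-\; \nabla p^{*}\,\Delta V . \]
  \item Solve the pressure-correction equation, whose source is the local
        mass imbalance of $\mathbf{u}^{*}$:
        \[ a_P\,p'_P \;=\; \sum_{nb} a_{nb}\,p'_{nb} \;+\; b' ,
           \qquad b' = -\,(\nabla\!\cdot\mathbf{u}^{*})\,\Delta V . \]
  \item Correct pressure and velocity, under-relaxing the pressure update
        with a factor $\alpha_p$:
        \[ p = p^{*} + \alpha_p\,p' , \qquad
           \mathbf{u} = \mathbf{u}^{*} - d_P\,\nabla p' . \]
  \item Take the corrected $p$ as the new pressure guess and iterate until
        the continuity residual falls below tolerance.
\end{enumerate}
```

The repeated guess-correct loop is what makes the scheme suitable for the steady-state nozzle computations described above.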
Abstract:
An auditory stimulus speeds up a digital response to a subsequent visual stimulus. This facilitatory effect has been related to the expectancy and the immediate arousal that would be caused by the accessory stimulus. The present study examined the relative contribution of these two influences. In a first and a third experiment a simple reaction time task was used. In a second and fourth experiment a go/no-go reaction time task was used. In each of these experiments, the accessory stimulus preceded the target stimulus by 200 ms for one group of male and female volunteers (G Fix). For another group of similar volunteers (G Var) the accessory stimulus preceded the target stimulus by 200 ms in 25% of the trials, by 1000 ms in 25% of the trials and was not followed by the target stimulus in 50% of the trials (Experiments 1a and 1b) or preceded the target stimulus by 200 ms in 6% of the trials and by 1000 ms in 94% of the trials (Experiments 2a and 2b). There was a facilitatory effect of the accessory stimulus for G Fix in the four experiments. There was also a facilitatory effect of the accessory stimulus at the 200-ms stimulus onset asynchrony for G Var in Experiments 1a and 1b but not in Experiments 2a and 2b. The facilitatory effects observed were larger in the go/no-go task than in the simple task. Taken together, these results suggest that expectancy is much more important than immediate arousal for the improvement of performance caused by an accessory stimulus.
Abstract:
We compared the cost-benefit of two algorithms recently proposed by the Centers for Disease Control and Prevention, USA, with the conventional one, the most appropriate for the diagnosis of hepatitis C virus (HCV) infection in the Brazilian population. Serum samples were obtained from 517 ELISA-positive or -inconclusive blood donors who had returned to Fundação Pró-Sangue/Hemocentro de São Paulo to confirm previous results. Algorithm A was based on the signal-to-cut-off (s/co) ratio of ELISA anti-HCV samples, using the s/co value that shows ≥95% concordance with immunoblot (IB) positivity. For algorithm B, reflex nucleic acid amplification testing by PCR was required for ELISA-positive or -inconclusive samples, and IB for PCR-negative samples. For algorithm C, all positive or inconclusive ELISA samples were submitted to IB. We observed a similar rate of positive results with the three algorithms: 287, 287, and 285 for A, B, and C, respectively, with 283 concordant among all three. Indeterminate results from algorithms A and C were resolved by PCR (expanded algorithm), which detected two more positive samples. The estimated costs of algorithms A and B were US$21,299.39 and US$32,397.40, respectively, which were 43.5 and 14.0% more economical than C (US$37,673.79). The cost can vary according to the technique used. We conclude that both algorithms A and B are suitable for diagnosing HCV infection in the Brazilian population. Furthermore, algorithm A is the more practical and economical one, since it requires supplemental tests for only 54% of the samples. Algorithm B provides early information about the presence of viremia.
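The reported savings follow directly from the quoted costs; a quick check, taking the stated totals of US$21,299.39 (A), US$32,397.40 (B) and US$37,673.79 (C) at face value:

```python
cost = {"A": 21299.39, "B": 32397.40, "C": 37673.79}

# percentage saved by each proposed algorithm relative to the conventional C
for alg in ("A", "B"):
    saving = 100 * (cost["C"] - cost[alg]) / cost["C"]
    print(f"{alg}: {saving:.1f}% cheaper than C")
# A: 43.5% cheaper than C
# B: 14.0% cheaper than C
```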
Abstract:
In the present study, we modeled a reaching task as a two-link mechanism. The upper arm and forearm motion trajectories during vertical arm movements were estimated from the angular accelerations measured with dual-axis accelerometers. A data set of reaching synergies from able-bodied individuals was used to train a radial basis function artificial neural network with upper arm/forearm tangential angular accelerations. The trained radial basis function artificial neural network predicted forearm motion from new upper arm trajectories of the specific movements with high correlation (mean, 0.9149-0.941). For all other movements, prediction was low (range, 0.0316-0.8302). The results suggest that the proposed algorithm generalizes successfully over similar motions and subjects. Such networks may be used as a high-level controller that predicts forearm kinematics from voluntary movements of the upper arm. This methodology is suitable for restoring upper limb function in individuals with motor disabilities of the forearm, but not of the upper arm. The developed control paradigm is applicable to upper-limb orthotic systems employing functional electrical stimulation, and the proposed approach is of great significance particularly for people with spinal cord injuries in a free-living environment. A further implication of the accelerometer-based measurement system developed for this study lies in the evaluation of movement during the course of rehabilitation: training-related changes in the synergies apparent from movement kinematics would characterize the extent and course of recovery. As such, a simple system using this methodology is of particular importance for stroke patients. The results underscore the important issue of upper-limb coordination.
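The kind of mapping the study learns (one limb-segment trajectory predicting another) can be sketched with a generic radial basis function network in NumPy. This is illustrative only: the centers, width and the smooth toy "synergy" below are invented, whereas the actual study trained on measured accelerometer data.

```python
import numpy as np

def rbf_features(X, centers, width):
    """Gaussian RBF design matrix: phi[i, j] = exp(-||x_i - c_j||^2 / (2 w^2))."""
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * width ** 2))

def fit_rbf(X, y, centers, width, ridge=1e-6):
    """Least-squares output weights for a fixed set of RBF centers."""
    Phi = rbf_features(X, centers, width)
    A = Phi.T @ Phi + ridge * np.eye(Phi.shape[1])
    return np.linalg.solve(A, Phi.T @ y)

def predict_rbf(X, centers, width, w):
    return rbf_features(X, centers, width) @ w

# toy 1-D "synergy": pretend the forearm signal is a smooth function
# of the upper-arm signal
x = np.linspace(-1, 1, 200)[:, None]          # upper-arm input
y = np.sin(2 * np.pi * x[:, 0])               # forearm output to predict
centers = np.linspace(-1, 1, 15)[:, None]     # fixed RBF centers
w = fit_rbf(x, y, centers, width=0.2)
corr = np.corrcoef(predict_rbf(x, centers, 0.2, w), y)[0, 1]
print(round(corr, 3))  # close to 1 on this smooth toy problem
```

Fixing the centers and solving only for the linear output weights is what keeps RBF networks cheap to train, which matters for a controller intended to run online.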
Abstract:
The objective of the present study was to compare the effect of acute exercise performed at different intensities relative to the anaerobic threshold (AT) on abilities requiring control of executive functions or alertness in physically active elderly females. Forty-eight physically active elderly females (63.8 ± 4.6 years old) were assigned by drawing lots to one of four groups: a control group without exercise or trial groups exercising at 60, 90, or 110% of AT (watts), and were submitted to 5 cognitive tests before and after exercise. Following cognitive pretesting, an incremental cycle ergometer test was conducted to determine AT, using a fixed blood lactate concentration of 3.5 mmol/L as the cutoff. Acute exercise executed at 90% of AT resulted in significant (P < 0.05, ANOVA) improvement in the performance of executive functions compared to control in 3 of 5 tests (verbal fluency, Tower of Hanoi test (number of movements), and Trail Making test B). Exercising at 60% of AT did not improve the results of any test of executive function, whereas exercise at 110% of AT improved performance in only one of these tests (verbal fluency) compared to control. Women from all trial groups exhibited a remarkable reduction in the Simple Response Time (alertness) test (P = 0.001). Thus, physical exercise performed close to the AT is more effective at improving cognitive processing in older women, even when performed acutely, and a customized exercise prescription based on the anaerobic threshold should optimize its beneficial effects.
Abstract:
This study investigated the influence of cueing on the performance of untrained and trained complex motor responses. Healthy adults responded to a visual target by performing four sequential movements (complex response) or a single movement (simple response) of their middle finger. A visual cue preceded the target by an interval of 300, 1000, or 2000 ms. In Experiment 1, the complex and simple responses were not previously trained. During the testing session, the complex response pattern varied on a trial-by-trial basis following the indication provided by the visual cue. In Experiment 2, the complex response and the simple response were extensively trained beforehand. During the testing session, the trained complex response pattern was performed in all trials. The latency of the untrained and trained complex responses decreased from the short to the medium and long cue-target intervals. The latency of the complex response was longer than that of the simple response, except in the case of the trained responses and the long cue-target interval. These results suggest that the preparation of untrained complex responses cannot be completed in advance, this being possible, however, for trained complex responses when enough time is available. The duration of the 1st submovement, 1st pause and 2nd submovement of the untrained and the trained complex responses increased from the short to the long cue-target interval, suggesting that there is an increase of online programming of the response possibly related to the degree of certainty about the moment of target appearance.
Abstract:
Our objective is to develop a diffusion Monte Carlo (DMC) algorithm to estimate the exact expectation values, ⟨ψ₀|Â|ψ₀⟩, of multiplicative operators, such as polarizabilities and high-order hyperpolarizabilities, for isolated atoms and molecules. The existing forward-walking pure diffusion Monte Carlo (FW-PDMC) algorithm which attempts this has a serious bias. On the other hand, the DMC algorithm with minimal stochastic reconfiguration provides unbiased estimates of the energies, but the expectation values ⟨ψ₀|Â|ψ⟩ are contaminated by ψ, a user-specified, approximate wave function, when Â does not commute with the Hamiltonian. We modified the latter algorithm to obtain the exact expectation values for these operators, while at the same time eliminating the bias. To compare the efficiency of the FW-PDMC and the modified DMC algorithms we calculated simple properties of the H atom, such as various functions of the coordinates and polarizabilities. Using three non-exact wave functions, one of moderate quality and the others very crude, in each case the results are within statistical error of the exact values.
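For readers unfamiliar with DMC, the basic branching random walk can be sketched in a few lines. This is plain, unguided textbook DMC on a 1-D harmonic oscillator, not the H atom calculations or the modified pure-DMC estimator of the thesis; the step size, population target and feedback constant are arbitrary choices.

```python
import numpy as np

def dmc_ground_energy(steps=2000, n0=500, dt=0.05, seed=1):
    """Minimal unguided diffusion Monte Carlo for V(x) = x^2 / 2.

    Walkers diffuse (imaginary-time kinetic term), then branch with
    weight exp(-(V - E_ref) * dt); E_ref is adjusted to keep the
    population near n0, and its running average estimates E_0 (= 0.5).
    """
    rng = np.random.default_rng(seed)
    x = rng.normal(0.0, 1.0, n0)        # initial walker positions
    e_ref = np.mean(0.5 * x ** 2)       # initial reference energy guess
    trace = []
    for step in range(steps):
        x = x + np.sqrt(dt) * rng.normal(size=x.size)   # diffusion move
        v = 0.5 * x ** 2
        w = np.exp(-(v - e_ref) * dt)                   # branching weights
        copies = (w + rng.random(x.size)).astype(int)   # stochastic rounding
        x = np.repeat(x, copies)
        # feedback keeps the population near n0 (1/dt is a common gain)
        e_ref = np.mean(0.5 * x ** 2) + (1.0 - x.size / n0) / dt
        if step >= steps // 2:
            trace.append(e_ref)
    return float(np.mean(trace))

print(round(dmc_ground_energy(), 2))  # should land near the exact E0 = 0.5
```

The thesis's concern is precisely what such a plain walk cannot do: positions are sampled from ψ₀ rather than ψ₀², so unbiased ⟨ψ₀|Â|ψ₀⟩ estimates require the forward-walking or reconfiguration machinery discussed in the abstract.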
Abstract:
In order to fully understand an organism's behaviour, the interactions between multiple enemies or selective pressures need to be considered, as these interactions are usually far more complex than the simple addition of their effects in isolation. In this thesis, I consider the impact of multiple enemies (fish predators and parasites) on the behaviour of three larval anurans (Lithobates sylvaticus, L. clamitans and L. catesbeianus). I also determine whether species that differ in life history and habitat preference possess different antipredator mechanisms, and how this affects their responses to multiple enemies. I show that the three ranid larvae respond differently to the trade-off imposed by the presence of both fish predators and trematode parasites in the environment. The two more permanent-pond breeders (L. clamitans and L. catesbeianus) increased activity in the combined presence of predators and parasites. In contrast, the temporary-pond breeder (L. sylvaticus) decreased activity in the combined presence of predators and parasites, in the same manner as it responded to fish alone. Further, the presence of fish along with parasites increased the susceptibility of both L. sylvaticus and L. clamitans to trematode infection, whereas parasite infection in L. catesbeianus was unaffected by the presence of fish. A second experiment assessing the palatability of the three anuran species to fish revealed a range of palatabilities, with L. catesbeianus being least palatable, L. clamitans somewhat unpalatable, and L. sylvaticus highly palatable. This result helps to explain the species differences in the observed behaviour in the combined presence of fish and parasites. In conclusion, the results of this study highlight the importance of considering the multiple selective pressures faced by organisms and how these shape their behaviour.
Abstract:
In this thesis we analyze dictionary graphs and some other kinds of graphs using the PageRank algorithm. We calculated the correlation between the degree and the PageRank of all nodes for graphs obtained from the Merriam-Webster dictionary, a French dictionary, and the WordNet hypernym and synonym dictionaries. Our conclusion was that PageRank can be a good tool for comparing the quality of dictionaries. We also studied some artificial social graphs and random graphs. We found that when we omitted some random nodes from each of the graphs, there were no significant changes in the ranking of the nodes according to their PageRank. We also discovered that some of the social graphs selected for our study were less resistant to such perturbations, showing larger changes in PageRank.
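The PageRank computation underlying these experiments can be reproduced with a short power iteration (a generic sketch on a made-up three-node graph; the thesis worked with real dictionary graphs):

```python
import numpy as np

def pagerank(adj, damping=0.85, tol=1e-10):
    """Power-iteration PageRank on an adjacency list {node: [out-neighbours]}."""
    nodes = sorted(adj)
    idx = {n: i for i, n in enumerate(nodes)}
    n = len(nodes)
    r = np.full(n, 1.0 / n)
    while True:
        new = np.full(n, (1.0 - damping) / n)   # teleportation term
        for u, outs in adj.items():
            if outs:  # distribute rank along out-links
                share = damping * r[idx[u]] / len(outs)
                for v in outs:
                    new[idx[v]] += share
            else:     # dangling node: spread its rank uniformly
                new += damping * r[idx[u]] / n
        if np.abs(new - r).sum() < tol:
            return dict(zip(nodes, new))
        r = new

# toy graph: "a" and "b" point at "c", which points back to "a"
ranks = pagerank({"a": ["c"], "b": ["c"], "c": ["a"]})
print(max(ranks, key=ranks.get))  # → c
```

Deleting a node and rerunning this iteration is exactly the robustness experiment described above: a resistant graph leaves the surviving ranking nearly unchanged.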