831 results for computational complexity


Relevance: 30.00%

Abstract:

Excepting the Peripheral and Central Nervous Systems, the Immune System is the most complex of somatic systems in higher animals. This complexity manifests itself at many levels, from the molecular to that of the whole organism. Much insight into this confounding complexity can be gained through computational simulation. Such simulations range in application from epitope prediction through to the modelling of vaccination strategies. In this review, we selectively evaluate various key applications relevant to computational vaccinology; these include techniques that operate at different scales, from the molecular to the organism and even the population level.

Relevance: 30.00%

Abstract:

As the complexity of parallel applications increases, the performance limitations resulting from computational load imbalance become dominant. Mapping the problem space to the processors in a parallel machine in a manner that balances the workload of each processor will typically reduce the run-time. In many cases the computation time required for a given calculation cannot be predetermined, even at run-time, and so static partitioning of the problem yields poor performance. For problems in which the computational load across the discretisation is dynamic and inhomogeneous, for example multi-physics problems involving fluid and solid mechanics with phase changes, the workload for a static subdomain will change over the course of the computation and cannot be estimated beforehand. For such applications the mapping of loads to processors must change dynamically at run-time in order to maintain reasonable efficiency. The issues of dynamic load balancing are examined in the context of PHYSICA, a three-dimensional unstructured-mesh multi-physics continuum mechanics computational modelling code.
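The mapping idea above can be illustrated with a toy sketch (illustrative only, not taken from PHYSICA): a greedy heuristic that always assigns the next-largest workload to the least-loaded processor. A dynamic balancer would repeat such a step at run-time as measured subdomain workloads drift.

```python
# Toy load-to-processor mapping: greedy "largest task to least-loaded processor".
import heapq

def map_loads(workloads, n_procs):
    """Assign workloads to processors, largest first, least-loaded first."""
    heap = [(0.0, p) for p in range(n_procs)]  # (accumulated load, processor id)
    heapq.heapify(heap)
    assignment = {p: [] for p in range(n_procs)}
    for w in sorted(workloads, reverse=True):
        load, p = heapq.heappop(heap)          # least-loaded processor so far
        assignment[p].append(w)
        heapq.heappush(heap, (load + w, p))
    return assignment

balanced = map_loads([8, 7, 6, 5, 4], 2)
print({p: sum(ws) for p, ws in balanced.items()})
```

The heap keeps the selection of the least-loaded processor at O(log P) per task; real dynamic schemes add migration costs and locality constraints on top of this.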

Relevance: 30.00%

Abstract:

This work aims to understand and unify information on epidemiological modelling methods and on how those methods relate to public policy addressing human health, specifically in the context of infectious disease prevention, pandemic planning, and health behaviour change. The thesis employs multiple qualitative and quantitative methods and is presented as a manuscript of several individual, data-driven projects combined in a narrative arc. The first chapter introduces the scope and complexity of this interdisciplinary undertaking, describing several important topical intersections. The second chapter begins the presentation of original data, describing in detail two exercises in computational epidemiological modelling pertinent to pandemic influenza planning and policy. The next chapter presents additional original data on how the public's confidence in modelling methodology may affect the health behaviour changes they plan to make, as recommended in public health policy. The final data-driven chapter describes how health policymakers use modelling methods and scientific evidence to inform and construct health policies for the prevention of infectious diseases. A concluding narrative chapter evaluates the breadth of these data and recommends strategies for the optimal use of modelling methodologies when informing public health policy in applied public health scenarios.

Relevance: 30.00%

Abstract:

Humans and robots have complementary strengths in performing assembly operations. Humans are very good at perception tasks in unstructured environments: they can recognize and locate a part in a box of miscellaneous parts, and they are also very good at complex manipulation in tight spaces. Their sensory characteristics, motor abilities, knowledge and skills give humans the ability to react to unexpected situations and resolve problems quickly. In contrast, robots are very good at pick-and-place operations and are highly repeatable in placement tasks. Robots can perform tasks at high speed while maintaining precision, can operate for long periods of time, and are very good at applying high forces and torques. Typically, robots are used in mass production, while small-batch and custom production operations predominantly use manual labor. High labor costs are making it difficult for small and medium manufacturers, which are mainly involved in small-batch and custom production, to remain cost-competitive in high-wage markets; they need a way to reduce the labor cost of assembly operations. Purely robotic cells will not provide them the necessary flexibility. Creating hybrid cells where humans and robots can collaborate in close physical proximity is a potential solution. The underlying idea behind such cells is to decompose assembly operations into tasks such that humans and robots can collaborate by performing the sub-tasks that are suitable for them. Realizing hybrid cells that enable effective human-robot collaboration is challenging. This dissertation addresses the following three computational issues involved in developing and utilizing hybrid assembly cells:

- We should be able to automatically generate plans to operate hybrid assembly cells to ensure efficient cell operation. This requires generating feasible assembly sequences and instructions for robots and human operators, respectively. Automated planning poses two challenges. First, generating operation plans for complex assemblies is difficult; the complexity can arise from the combinatorial explosion caused by the size of the assembly or from the complex paths needed to perform the assembly. Second, generating feasible plans requires accounting for robot and human motion constraints. The first objective of the dissertation is to develop the underlying computational foundations for automatically generating plans for the operation of hybrid cells, addressing both assembly complexity and motion constraints.

- The collaboration between humans and robots in the assembly cell will only be practical if human safety can be ensured during the assembly tasks that require collaboration. The second objective of the dissertation is to evaluate different options for real-time monitoring of the state of the human operator with respect to the robot, and to develop strategies for taking appropriate measures when a planned robot move may compromise the operator's safety. To be competitive in the market, the developed solution will have to consider cost without significantly compromising quality.

- In the envisioned hybrid cell, we rely on human operators to bring parts into the cell. If the human operator selects the wrong part or fails to place it correctly, the robot will be unable to perform the task assigned to it; if the error goes undetected, it can lead to a defective product and inefficiencies in cell operation. Human error can result either from confusion due to poor-quality instructions or from the operator not paying adequate attention to the instructions. To ensure smooth and error-free operation of the cell, we will need to monitor the state of the assembly operations in the cell. The third objective of the dissertation is to identify and track parts in the cell and automatically generate instructions for corrective actions if a human operator deviates from the selected plan. Potential corrective actions may involve re-planning, if it is possible to continue assembly from the current state, or issuing warnings and generating instructions to undo the current task.
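As a toy illustration of the sequencing half of the first objective (not the dissertation's planner; part names and precedence constraints below are invented), a feasible assembly sequence can be generated from precedence constraints by topological sorting:

```python
# Feasible assembly sequencing from precedence constraints (Kahn's algorithm).
from collections import deque

def assembly_sequence(parts, precedences):
    """precedences: (before, after) pairs; returns a feasible order, or None if cyclic."""
    indegree = {p: 0 for p in parts}
    succ = {p: [] for p in parts}
    for before, after in precedences:
        succ[before].append(after)
        indegree[after] += 1
    ready = deque(p for p in parts if indegree[p] == 0)
    order = []
    while ready:
        p = ready.popleft()
        order.append(p)
        for nxt in succ[p]:
            indegree[nxt] -= 1
            if indegree[nxt] == 0:
                ready.append(nxt)
    return order if len(order) == len(parts) else None  # None => contradictory constraints

seq = assembly_sequence(["base", "gear", "cover"],
                        [("base", "gear"), ("gear", "cover")])
print(seq)  # ['base', 'gear', 'cover']
```

Real planners must additionally check geometric feasibility and robot/human motion constraints for each candidate step, which is where the combinatorial explosion the text mentions arises.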

Relevance: 30.00%

Abstract:

Self-replication and compartmentalization are two central properties thought to be essential for minimal life, and understanding how such processes interact in the emergence of complex reaction networks is crucial to exploring the development of complexity in chemistry and biology. Autocatalysis can emerge from multiple different mechanisms, such as formation of an initiator, template self-replication and physical autocatalysis (where micelles formed from the reaction product solubilize the reactants, leading to higher local concentrations and therefore higher rates). Amphiphiles are also used in artificial life studies to create protocell models such as micelles, vesicles and oil-in-water droplets, and can increase reaction rates by encapsulation of reactants. So far, no template self-replicator exists which is capable of compartmentalization, or of transferring this molecular-scale phenomenon to micro- or macro-scale assemblies. Here a system is demonstrated where an amphiphilic imine catalyses its own formation by joining a non-polar alkyl tail group with a polar carboxylic acid head group to form a template, which was shown to form reverse micelles by Dynamic Light Scattering (DLS). The kinetics of this system were investigated by 1H NMR spectroscopy, showing clearly that a template self-replication mechanism operates, though there was no evidence that the reverse micelles participated in physical autocatalysis. Active oil droplets, composed of a mixture of insoluble organic compounds in an aqueous sub-phase, can undergo processes such as division, self-propulsion and chemotaxis, and are studied as models for minimal cells, or protocells. Although in most cases the Marangoni effect is responsible for the forces on the droplet, the behaviour of the droplet depends heavily on the exact composition.
Though theoretical models can calculate the forces on a droplet, modelling a mixture of oils on an aqueous surface, where compounds from the oil phase are dissolving and diffusing through the aqueous phase, is beyond current computational capability. The behaviour of a droplet in an aqueous phase can therefore only be discovered through experiment, even though it is determined by the droplet's composition. By using an evolutionary algorithm and a liquid handling robot to conduct droplet experiments and decide which compositions to test next, entirely autonomously, the composition of the droplet becomes a chemical genome capable of evolution. The selection is carried out according to a fitness function, which ranks each formulation on how well it conforms to the chosen fitness criteria (e.g. movement or division). Over successive generations, significant increases in fitness are achieved, and this increase is higher with more components (i.e. greater complexity). Other chemical processes such as chemiluminescence and gelation were investigated in active oil droplets, demonstrating the possibility of controlling chemical reactions by selective droplet fusion. Potential future applications might include combinatorial chemistry, or additional fitness goals for the genetic algorithm. Combining the self-replication and droplet protocell research, it was demonstrated that the presence of the amphiphilic replicator lowers the interfacial tension between droplets of a reaction mixture in organic solution and the alkaline aqueous phase, causing them to divide. Periodic sampling by a liquid handling robot revealed that the extent of droplet fission increased as the reaction progressed, producing more individual protocells with increased self-replication. This demonstrates coupling of the molecular-scale phenomenon of template self-replication to a macroscale physicochemical effect.

Relevance: 20.00%

Abstract:

Motivated by a recently proposed biologically inspired face recognition approach, we investigated the relation between human behavior and a computational model based on Fourier-Bessel (FB) spatial patterns. We measured human recognition performance of FB-filtered face images using an 8-alternative forced-choice method. Test stimuli were generated by converting the images from the spatial to the FB domain, filtering the resulting coefficients with a band-pass filter, and finally taking the inverse FB transformation of the filtered coefficients. The performance of the computational models was tested using a simulation of the psychophysical experiment. In the FB model, face images were first filtered by simulated V1-type neurons and later analyzed globally for their content of FB components. In general, there was higher human contrast sensitivity to radially than to angularly filtered images, but both functions peaked at the 11.3-16 frequency interval. The FB-based model presented similar behavior with regard to peak position and relative sensitivity, but had a wider frequency bandwidth and a narrower response range. The response patterns of two alternative models, based on local FB analysis and on raw luminance, strongly diverged from the human behavior patterns. These results suggest that human performance can be constrained by the type of information conveyed by polar patterns, and consequently that humans might use FB-like spatial patterns in face processing.

Relevance: 20.00%

Abstract:

We evaluated the performance of a novel procedure for segmenting mammograms and detecting clustered microcalcifications in two types of image sets obtained from digitization of mammograms using either a laser scanner or a conventional "optical" scanner. Specific regions forming the digital mammograms were identified and selected, in which clustered microcalcifications appeared or not. A remarkable increase in image intensity was noticed in the images from the optical scanner compared with the original mammograms. A procedure based on a polynomial correction was developed to compensate for the changes in the characteristic curves of the scanners relative to the curves of the films. The processing scheme was applied to both sets, before and after the polynomial correction. The results clearly indicated the influence of mammogram digitization on the performance of processing schemes intended to detect microcalcifications. The image processing techniques applied to mammograms digitized by both scanners, without the polynomial intensity correction, resulted in better sensitivity in detecting microcalcifications in the images from the laser scanner. However, when the polynomial correction was applied to the images from the optical scanner, no differences in performance were observed between the two types of images. (C) 2008 SPIE and IS&T [DOI: 10.1117/1.3013544]
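The polynomial correction can be sketched as follows (an assumed reconstruction, not the authors' code, with an invented characteristic curve): fit a least-squares polynomial that maps the optical scanner's intensity response onto the reference curve, then apply it pixel-wise.

```python
# Sketch of a polynomial intensity correction between scanner characteristic curves.
import numpy as np

def fit_correction(optical_levels, reference_levels, degree=3):
    """Least-squares polynomial mapping optical intensities to reference intensities."""
    coeffs = np.polyfit(optical_levels, reference_levels, degree)
    return np.poly1d(coeffs)

# Synthetic example: an optical scanner that inflates intensity nonlinearly.
x = np.linspace(0.0, 1.0, 50)          # reference (film) intensity levels
optical = 0.2 + 0.9 * x ** 0.8         # distorted characteristic curve (invented)
correct = fit_correction(optical, x)   # fitted correction polynomial
corrected = correct(optical)
print(float(np.abs(corrected - x).max()))  # residual after correction
```

In practice the curves would be measured from step-wedge calibration targets digitized on both scanners rather than assumed analytically.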

Relevance: 20.00%

Abstract:

Chemical reactivity, photolability, and computational studies of the ruthenium nitrosyl complex with a substituted cyclam, fac-[Ru(NO)Cl₂(κ³N⁴,N⁸,N¹¹-(1-carboxypropyl)cyclam)]Cl·H₂O ((1-carboxypropyl)cyclam = 3-(1,4,8,11-tetraazacyclotetradecan-1-yl)propionic acid), (I) are described. Chloride ligands do not undergo aquation reactions (at 25 °C, pH 3). The rate of nitric oxide (NO) dissociation (k_obs-NO) upon reduction of I is 2.8 s⁻¹ at 25 ± 1 °C (in 0.5 mol L⁻¹ HCl), which is close to the highest value found for related complexes. The uncoordinated carboxyl of I has a pKa of ~3.3, close to that of the carboxyl of the non-coordinated (1-carboxypropyl)cyclam (pKa = 3.4). Two additional pKa values were found for I, at ~8.0 and ~11.5. Upon electrochemical reduction or under irradiation with light (λirr = 350 or 520 nm; pH 7.4), I releases NO in aqueous solution. The cyclam ring N bound to the carboxypropyl group is not coordinated, resulting in a fac configuration that affects the properties and chemical reactivities of I, especially as an NO donor, compared with analogous trans complexes. Among the computational models tested, B3LYP/ECP28MDF, cc-pVDZ resulted in smaller errors for the geometry of I. The computational data helped clarify the experimental acid-base equilibria and indicated the most favourable site for the second deprotonation, which follows that of the carboxyl group. Furthermore, it showed that by changing the pH it is possible to modulate the electron density of I through deprotonation. The calculated N-O bond length and the Ru/NO charge ratio indicated that the predominant canonical structure is [Ru(III)NO], but the Ru-NO bond angles and bond index (b.i.) values were less clear; the angles suggested that [Ru(II)NO⁺] could contribute to the electronic structure of I, and the b.i. values indicated a contribution from [Ru(IV)NO⁻].
Considering that some experimental data are consistent with a [Ru(II)NO⁺] description, while others are in agreement with [Ru(III)NO], the best description of I is a linear combination of the three canonical forms, with higher weights for [Ru(II)NO⁺] and [Ru(III)NO].

Relevance: 20.00%

Abstract:

Background: High-level piano performance requires complex integration of perceptual, motor, cognitive and emotive skills. Observations in psychology and neuroscience studies have suggested reciprocal inhibitory modulation of cognition by emotion and of emotion by cognition. However, it is still unclear how cognitive states may influence pianistic performance. The aim of the present study is to verify the influence of cognitive and affective attention on piano performances. Methods and Findings: Nine pianists were instructed to play the same piece of music, first focusing only on cognitive aspects of the musical structure (cognitive performances) and second paying attention solely to affective aspects (affective performances). Audio files from the pianistic performances were examined using a computational model that retrieves nine specific musical features (descriptors): loudness, articulation, brightness, harmonic complexity, event detection, key clarity, mode detection, pulse clarity and repetition. In addition, the number of volunteers' errors in the recording sessions was counted. Comments from the pianists about their thoughts during performances were also evaluated. The analyses of the audio files through the musical descriptors indicated that the affective performances have more agogics, legato and piano phrasing, and less perceived event density, than the cognitive ones. Error analysis demonstrated that volunteers misplayed more left-hand notes in the cognitive performances than in the affective ones. Volunteers also played more wrong notes in affective than in cognitive performances. These results correspond to the volunteers' comments that in the affective performances the cognitive aspects of piano execution are inhibited, whereas in the cognitive performances the expressiveness is inhibited.
Conclusions: Therefore, the present results indicate that attention to the emotional aspects of performance enhances expressiveness, but constrains cognitive and motor skills in the piano execution. In contrast, attention to the cognitive aspects may constrain the expressivity and automatism of piano performances.

Relevance: 20.00%

Abstract:

During the early Holocene two main Paleoamerican cultures thrived in Brazil: the Tradicao Nordeste in the semi-desertic Sertao and the Tradicao Itaparica in the high plains of the Planalto Central. Here we report on paleodietary signals of a Paleoamerican found in a third Brazilian ecological setting: a riverine shellmound, or sambaqui, located in the Atlantic forest. Most sambaquis are found along the coast, and the peoples associated with them subsisted on marine resources. We report a different situation from the oldest recorded riverine sambaqui, called Capelinha. Capelinha is a relatively small sambaqui established along a river 60 km from the Atlantic coast. It contained the well-preserved remains of a Paleoamerican known as Luzio, dated to 9,945 ± 235 years ago; the oldest sambaqui dweller so far. Luzio's bones were remarkably well preserved and allowed for stable isotopic analysis of diet. Although artifacts found at this riverine site show connections with the Atlantic coast, we show that he represents a population that was dependent on inland resources as opposed to marine coastal resources. After comparing Luzio's paleodietary data with that of other extant and prehistoric groups, we discuss where his group could have come from, whether a terrestrial diet persisted in riverine sambaquis, and how Luzio fits within the discussion of the replacement of Paleoamerican by Amerindian morphology. This study adds to the evidence showing greater complexity in the prehistory of the colonization of, and the adaptations to, the New World.

Relevance: 20.00%

Abstract:

Background: The inference of gene regulatory networks (GRNs) from large-scale expression profiles is one of the most challenging problems in Systems Biology today. Many techniques and models have been proposed for this task. However, it is generally not possible to recover the original topology with great accuracy, mainly due to the short time series data in the face of the high complexity of the networks and the intrinsic noise of the expression measurements. In order to improve the accuracy of GRN inference methods based on entropy (mutual information), a new criterion function is here proposed. Results: In this paper we introduce the use of the generalized entropy proposed by Tsallis for the inference of GRNs from time series expression profiles. The inference process is based on a feature selection approach, and the conditional entropy is applied as the criterion function. In order to assess the proposed methodology, the algorithm is applied to recover the network topology from temporal expressions generated by an artificial gene network (AGN) model as well as from the DREAM challenge. The adopted AGN is based on theoretical models of complex networks, and its gene transfer functions are obtained by random drawing from the set of possible Boolean functions, thus creating its dynamics. The DREAM time series data, on the other hand, vary in network size, and their topologies are based on real networks; the dynamics are generated by continuous differential equations with noise and perturbation. By adopting both data sources, it is possible to estimate the average quality of the inference with respect to different network topologies, transfer functions and network sizes. Conclusions: A remarkable improvement of accuracy was observed in the experimental results, with the non-Shannon entropy reducing the number of false connections in the inferred topology.
The best free parameter of the Tsallis entropy obtained was on average in the range 2.5 ≤ q ≤ 3.5 (hence, subextensive entropy), which opens new perspectives for GRN inference methods based on information theory and for investigation of the nonextensivity of such networks. The inference algorithm and criterion function proposed here were implemented and included in the DimReduction software, which is freely available at http://sourceforge.net/projects/dimreduction and http://code.google.com/p/dimreduction/.
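For reference, the Tsallis (generalized) entropy underlying the criterion is S_q(p) = (1 - Σᵢ pᵢ^q)/(q - 1), which recovers the Shannon entropy as q → 1. A minimal sketch follows; the paper's feature-selection and conditional-entropy machinery is not reproduced here.

```python
# Tsallis entropy of a discrete distribution, with the Shannon limit at q = 1.
import math

def tsallis_entropy(probs, q):
    """S_q = (1 - sum_i p_i**q) / (q - 1); Shannon entropy (nats) as q -> 1."""
    if abs(q - 1.0) < 1e-12:
        return -sum(p * math.log(p) for p in probs if p > 0)  # Shannon limit
    return (1.0 - sum(p ** q for p in probs)) / (q - 1.0)

uniform = [0.25] * 4
print(tsallis_entropy(uniform, 3.0))  # q in the subextensive range the paper reports
print(tsallis_entropy(uniform, 1.0))  # Shannon entropy of a uniform 4-state source
```

For q > 1 the entropy weights dominant probabilities more heavily than Shannon entropy does, which is the lever the non-Shannon criterion exploits.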

Relevance: 20.00%

Abstract:

We evaluated the reliability and validity of a Brazilian-Portuguese version of the Epilepsy Medication Treatment Complexity Index (EMTCI). Interrater reliability was evaluated with the intraclass correlation coefficient (ICC), and validity was evaluated by correlation of mean EMTCI scores with the following variables: number of antiepileptic drugs (AEDs), seizure control, patients` perception of seizure control, and adherence to the therapeutic regimen as measured with the Morisky scale. We studied patients with epilepsy followed in a tertiary university-based hospital outpatient clinic setting, aged 18 years or older, independent in daily living activities, and without cognitive impairment or active psychiatric disease. ICCs ranged from 0.721 to 0.999. Mean EMTCI scores were significantly correlated with the variables assessed. Higher EMTCI scores were associated with an increasing number of AEDs, uncontrolled seizures, patients` perception of lack of seizure control, and poorer adherence to the therapeutic regimen. The results indicate that the Brazilian-Portuguese EMTCI is reliable and valid to be applied clinically in the country. The Brazilian-Portuguese EMTCI version may be a useful tool in developing strategies to minimize treatment complexity, possibly improving seizure control and quality of life in people with epilepsy in our milieu. (C) 2011 Elsevier Inc. All rights reserved.

Relevance: 20.00%

Abstract:

Aging is known to have a degrading influence on many structures and functions of the human sensorimotor system. The present work assessed aging-related changes in postural sway using fractal and complexity measures of the center of pressure (COP) dynamics, with the hypothesis that complexity and fractality decrease in older individuals. Older subjects (68 ± 4 years) and young adult subjects (28 ± 7 years) performed a quiet stance task (60 s) and a prolonged standing task (30 min) in which subjects were allowed to move freely. Long-range correlations (fractality) of the data were estimated by detrended fluctuation analysis (DFA); changes in entropy were estimated by the multi-scale entropy (MSE) measure. The DFA results showed that the fractal dimension was lower for the older subjects than for the young adults, but the fractal dimensions of both groups were not different from 1/f noise for time intervals between 10 and 600 s. The MSE analysis performed with the typically applied adjustment to the criterion distance showed a higher degree of complexity in the older subjects, which is inconsistent with the hypothesis that complexity in the human physiological system decreases with aging. The same MSE analysis performed without adjustment showed no differences between the groups. Taking all results together, the decrease in total postural sway and in long-range correlations in older individuals are signs of an adaptation process reflecting the diminishing ability to generate adequate responses on a longer time scale.
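A compact sketch of DFA, the long-range-correlation estimator used in this study (window sizes and data below are illustrative; the authors' COP preprocessing is not reproduced):

```python
# Detrended fluctuation analysis: integrate the series, linearly detrend it in
# windows of size n, and read the scaling exponent alpha off log F(n) vs log n.
import numpy as np

def dfa(signal, window_sizes):
    """Return the DFA scaling exponent alpha of a 1-D signal."""
    y = np.cumsum(signal - np.mean(signal))   # integrated, mean-removed profile
    fluctuations = []
    for n in window_sizes:
        n_windows = len(y) // n
        sq_residuals = []
        for i in range(n_windows):
            seg = y[i * n:(i + 1) * n]
            t = np.arange(n)
            trend = np.polyval(np.polyfit(t, seg, 1), t)  # per-window linear trend
            sq_residuals.append(np.mean((seg - trend) ** 2))
        fluctuations.append(np.sqrt(np.mean(sq_residuals)))
    alpha, _ = np.polyfit(np.log(window_sizes), np.log(fluctuations), 1)
    return alpha

rng = np.random.default_rng(0)
white = rng.standard_normal(4096)
alpha = dfa(white, [16, 32, 64, 128, 256])
print(alpha)  # near 0.5 for uncorrelated noise
```

Exponents near 0.5 indicate uncorrelated fluctuations, ~1.0 corresponds to 1/f noise (the regime both groups showed here), and ~1.5 to Brownian motion.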

Relevance: 20.00%

Abstract:

The aim of this study was to investigate the effects of knowledge of results (KR) frequency and task complexity on motor skill acquisition. The task consisted of throwing a bocha ball to place it as close as possible to the target ball. 120 students aged 11 to 73 years were assigned to one of eight experimental groups according to knowledge of results frequency (25, 50, 75, and 100%) and task complexity (simple and complex). Subjects performed 90 trials in the acquisition phase and 10 trials in the transfer test. The results showed that knowledge of results given at a frequency of 25% resulted in a lower absolute error than the 50% frequency and a lower variable error than the 50, 75, and 100% frequencies, but no effect of task complexity was found.