976 results for Perturb and observe


Relevance:

90.00%

Publisher:

Abstract:

The extent to which North Atlantic Holocene climatic perturbations influenced past human societies is an area of considerable uncertainty and fierce debate. Ireland is ideally placed to help resolve this issue, having been occupied for over 9000 yr and located on the eastern Atlantic seaboard, a region dominated by westerly airflow. Irish bog and lake tree populations provide unambiguous evidence of major shifts in surface moisture through the Holocene, similar to cycles recorded in the marine realm of the North Atlantic, indicating significant changes in the latitude and intensity of zonal atmospheric circulation across the region. To test for human response to these cycles we summed the probabilities of 465 radiocarbon ages obtained from Irish archaeological contexts and observed enhanced archaeological visibility during periods of sustained wet conditions. These results suggest either an increasing density of human populations in key, often defensive locations, and/or the development of subsistence strategies to overcome changing conditions, the latter recently proposed as a significant factor in avoiding societal collapse. Regardless, we demonstrate that environmental change is a significantly more important factor in influencing human activity in the landscape than has hitherto been acknowledged.
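The summed-probability step can be sketched as follows. This is a minimal illustration that assumes each calibrated radiocarbon date can be approximated by a Gaussian density; real analyses calibrate against a curve such as IntCal, and the example ages below are invented, not the study's 465 archaeological dates.

```python
import numpy as np

def summed_probability(ages, errors, grid):
    """Sum one normalized density per radiocarbon age. A real analysis
    would first calibrate each age against a curve such as IntCal; the
    Gaussian form used here is a simplifying assumption."""
    spd = np.zeros_like(grid, dtype=float)
    for age, err in zip(ages, errors):
        density = np.exp(-0.5 * ((grid - age) / err) ** 2)
        density /= density.sum()  # each date contributes total mass 1
        spd += density
    return spd

# Illustrative ages (cal yr BP) and 1-sigma errors -- invented placeholders.
grid = np.arange(0, 9000, 10)
spd = summed_probability([4500, 4600, 7200], [40, 60, 50], grid)
```

Peaks in the summed curve then mark periods of enhanced archaeological visibility, which is what the study compares against the wet/dry cycles.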


Conscientiousness, a factor in the Big Five personality inventory, has been recognized as a key variable for understanding students' academic performance. The aim of this paper is to analyze the relationship between the conscientiousness personality factor and two of its facets, laboriousness and planning, and academic performance, and to examine whether there are gender differences in this personality factor. A total of 456 Spanish high school and college students participated in the study. They were asked to answer a personality inventory and a self-report questionnaire. The results show that both conscientiousness as a personality dimension and the laboriousness facet are able to predict academic performance, especially with regard to students' exam marks, classroom attendance and dedication to study. Regarding gender, women scored higher than men on this personality factor. From a practical perspective, these results indicate that establishing a routine of continuous work is suitable for improving students' grades and their adaptation to the educational environment.


Catastrophic events, such as wars and terrorist attacks, tornadoes and hurricanes, earthquakes, tsunamis, floods and landslides, are always accompanied by a large number of casualties. The size distribution of these casualties has separately been shown to follow approximate power law (PL) distributions. In this paper, we analyze the statistical distributions of the number of victims of catastrophic phenomena, in particular, terrorism, and find double PL behavior. This means that the data sets are better approximated by two PLs instead of a single one. We plot the PL parameters, corresponding to several events, and observe an interesting pattern in the charts, where the lines that connect each pair of points defining the double PLs are almost parallel to each other. A complementary data analysis is performed by means of the computation of the entropy. The results reveal relationships hidden in the data that may trigger a future comprehensive explanation of this type of phenomena.
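A rough numerical sketch of fitting two PLs on either side of a breakpoint is shown below; the least-squares fit in log-log space and the synthetic casualty sizes are illustrative assumptions, not the paper's estimation procedure or data.

```python
import numpy as np

def fit_double_pl(sizes, breakpoint):
    """Fit one power-law exponent below and one above a breakpoint to
    the empirical complementary CDF, by least squares in log-log space
    (a rough sketch; not necessarily the paper's estimation method)."""
    x = np.sort(np.asarray(sizes, dtype=float))
    ccdf = 1.0 - np.arange(len(x)) / len(x)  # P(X >= x)
    def alpha(mask):
        slope, _ = np.polyfit(np.log(x[mask]), np.log(ccdf[mask]), 1)
        return -slope  # ccdf ~ x**(-alpha)
    return alpha(x < breakpoint), alpha(x >= breakpoint)

# Synthetic casualty counts drawn from two Pareto regimes, an
# illustrative stand-in for real event data.
rng = np.random.default_rng(0)
sizes = np.concatenate([rng.pareto(1.0, 500) + 1.0,
                        (rng.pareto(2.5, 500) + 1.0) * 50.0])
alpha_lo, alpha_hi = fit_double_pl(sizes, breakpoint=50.0)
```

"Double PL behavior" means the two fitted exponents differ, so a single straight line in the log-log plot cannot describe both regimes.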


Advances in technology have produced more and more intricate industrial systems, such as nuclear power plants, chemical centers and petroleum platforms. Such complex plants exhibit multiple interactions among smaller units and human operators, giving rise to potentially disastrous failures that can propagate across subsystem boundaries. This paper analyzes industrial accident data series from the perspective of statistical physics and dynamical systems. Global data were collected from the Emergency Events Database (EM-DAT) for the period from 1903 to 2012. The statistical distributions of the number of fatalities caused by industrial accidents reveal Power Law (PL) behavior. We analyze the evolution of the PL parameters over time and observe a remarkable increase in the PL exponent in recent years. PL behavior allows prediction by extrapolation over a wide range of scales. In a complementary line of thought, we compare the data using appropriate indices and use different visualization techniques to correlate and to extract relationships among industrial accident events. This study contributes to a better understanding of the complexity of modern industrial accidents and their ruling principles.
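The exponent-tracking step can be sketched with the standard continuous maximum-likelihood (Hill) estimator; the per-decade samples below are synthetic placeholders, not EM-DAT records.

```python
import numpy as np

def pl_exponent_mle(samples, xmin):
    """Maximum-likelihood (Hill) estimate of alpha for a continuous
    power law p(x) ~ x**(-alpha), using only samples >= xmin."""
    x = np.asarray([s for s in samples if s >= xmin], dtype=float)
    return 1.0 + len(x) / np.sum(np.log(x / xmin))

# Synthetic fatality counts per decade (invented, not EM-DAT records):
# Pareto draws whose true exponent (shape + 1) grows over time.
rng = np.random.default_rng(1)
by_decade = {year: (rng.pareto(shape, 200) + 1.0) * 10.0
             for year, shape in [(1990, 1.5), (2000, 1.8), (2010, 2.2)]}
alphas = {year: pl_exponent_mle(x, xmin=10.0) for year, x in by_decade.items()}
```

Estimating the exponent window by window in this way is one simple means of observing how it drifts over the years.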


To ensure the quality of machined products at minimum machining cost and maximum machining effectiveness, it is very important to select optimum parameters when metal cutting machine tools are employed. Traditionally, the experience of the operator plays a major role in the selection of optimum metal cutting conditions; however, attaining optimum values each time is difficult even for a skilled operator. The non-linear nature of the machining process has compelled engineers to search for more effective methods of optimization. The design objective preceding most engineering design activities is simply to minimize the cost of production or to maximize production efficiency. The main aim of the research work reported here is to build robust optimization algorithms by exploiting ideas that nature has to offer and using them to solve real-world optimization problems in manufacturing processes.

After conducting an exhaustive literature review, several optimization techniques used in various manufacturing processes were identified. The selection of optimal cutting parameters, such as depth of cut, feed and speed, is a very important issue for every machining process. Experiments were designed using the Taguchi technique, and dry turning of SS420 was performed on a Kirloskar Turnmaster 35 lathe. S/N and ANOVA analyses were performed to find the optimum level and the percentage contribution of each parameter, and the optimum machining parameters were obtained from the experiments using S/N analysis.

Optimization algorithms begin with one or more design solutions supplied by the user and then iteratively search the relevant design space in order to reach the true optimum solution. A mathematical model for surface roughness was developed using response surface analysis, and the model was validated using published results from the literature. Optimization methodologies such as Simulated Annealing (SA), Particle Swarm Optimization (PSO), the Conventional Genetic Algorithm (CGA) and an Improved Genetic Algorithm (IGA) were applied to optimize the machining parameters for dry turning of SS420. All the above algorithms were tested for efficiency, robustness and accuracy, and we observed that they often outperform conventional optimization methods applied to difficult real-world problems. The SA, PSO, CGA and IGA codes were developed in MATLAB. For each evolutionary algorithm, optimum cutting conditions are provided to achieve better surface finish.

The computational results using SA clearly demonstrate that the proposed solution procedure is quite capable of solving such complicated problems effectively and efficiently. PSO is a relatively recent heuristic search method whose mechanics are inspired by the swarming or collaborative behavior of biological populations; the results show that PSO provides better results and is also more computationally efficient. Comparing CGA and IGA for the optimization of the machining process, the proposed IGA provides better results than the conventional GA. The improved genetic algorithm, incorporating a stochastic crossover technique and an artificial initial-population scheme, provides a faster search mechanism. Finally, a comparison among these algorithms was made for the specific example of dry turning of SS420, arriving at optimum machining parameters of feed, cutting speed, depth of cut and tool nose radius with minimum surface roughness as the criterion.

To summarize, the research work fills conspicuous gaps between research prototypes and industry requirements by simulating evolutionary procedures seen in nature, which optimizes its own systems.
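As a hedged illustration of the swarm-based approach, the sketch below minimizes a hypothetical quadratic response-surface model of surface roughness with a minimal global-best PSO. The model coefficients, parameter bounds and PSO constants are invented for illustration, not the thesis's fitted model or tuned settings.

```python
import numpy as np

def pso(objective, bounds, n_particles=30, iters=200, seed=0):
    """Minimal global-best particle swarm optimizer. The inertia and
    acceleration constants are common textbook defaults, not the
    thesis's tuned values."""
    rng = np.random.default_rng(seed)
    lo, hi = np.array(bounds, dtype=float).T
    dim = len(bounds)
    pos = rng.uniform(lo, hi, (n_particles, dim))
    vel = np.zeros_like(pos)
    pbest = pos.copy()
    pbest_val = np.array([objective(p) for p in pos])
    gbest = pbest[np.argmin(pbest_val)].copy()
    w, c1, c2 = 0.7, 1.5, 1.5
    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, dim))
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos = np.clip(pos + vel, lo, hi)  # keep particles inside bounds
        vals = np.array([objective(p) for p in pos])
        better = vals < pbest_val
        pbest[better], pbest_val[better] = pos[better], vals[better]
        gbest = pbest[np.argmin(pbest_val)].copy()
    return gbest, float(pbest_val.min())

# Hypothetical quadratic response-surface model of surface roughness
# Ra(speed, feed, depth of cut); coefficients are invented placeholders.
def ra(x):
    v, f, d = x
    return (2.0 + 0.004 * (v - 180.0) ** 2
                + 40.0 * (f - 0.12) ** 2
                + 1.5 * (d - 0.8) ** 2)

best, best_ra = pso(ra, bounds=[(100.0, 250.0), (0.05, 0.3), (0.2, 1.5)])
```

Each particle tracks its personal best while being pulled toward the swarm's global best; on a smooth response surface like this one, the swarm converges quickly to the region of minimum roughness.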


Introduction: Duchenne and Becker muscular dystrophies (DMD/BMD) are X-linked recessive diseases characterized by progressive muscle weakness and wasting, loss of motor skills, and death after the second decade of life. Deletions are the most prevalent mutations affecting the dystrophin gene, which spans 79 exons. Objective: To identify deletions in the dystrophin gene in 58 patients affected with DMD. Methods: Using multiplex PCR, we screened for deletions in the dystrophin gene in 58 patients with DMD and observed the frequency of this mutation in our population. Results: We found deletions in 1.72% of patients (1 of 58). Deletions were not the principal cause of disease in our population; it is possible that duplications and point mutations caused the illness in our patients. Conclusions: The frequency of deletions in the 15 exons analyzed from the dystrophin gene was low. Unlike what has been reported in the literature worldwide, deletions were not the predominant type of mutation in our patients' samples; it is therefore important to screen for other types of mutations, such as duplications and point mutations.


The objective of this study was to investigate the spatial and temporal distribution of Libinia spinosa H. Milne Edwards, 1834 on unconsolidated sublittoral bottoms in two regions off the northern coast of the state of São Paulo, to analyze the influence of environmental factors on the occurrence of this species, and to observe the recruitment pattern of its young. Crabs were collected monthly (July 2001 through June 2003) at depths of 5, 10, 15, 20, 25, 30 and 35 m, from a fishing boat equipped with two double-rig nets. Samples of water and sediment were collected for analysis of environmental factors. A total of 2112 spider crabs was obtained (701 juveniles and 1411 adults). The highest abundance was observed at depths of 20 and 25 m in both regions. These localities were characterized by substrate composed of very fine sand and silt-clay. With regard to the temporal distribution, juveniles and adults predominated in the summer and winter months, respectively. From these results, one can infer that the distribution of L. spinosa is related to environmental factors favorable to its life cycle; sediment type is the factor which most strongly determines its presence. © E. Schweizerbart'sche Verlagsbuchhandlung (Nägele u. Obermiller), 2007.




A correlated two-body basis function is used to describe three-dimensional bosonic clusters interacting via a two-body van der Waals potential. We calculate the ground state and the zero-orbital-angular-momentum excited states for Rb_N clusters with up to N = 40. We solve the many-particle Schrödinger equation by the potential harmonics expansion method, which keeps all possible two-body correlations in the calculation and determines the lowest effective many-body potential. We study the energetics and structural properties of such diffuse clusters at both the dimer and tuned scattering lengths. The motivation of the present study is to investigate the possibility of the formation of N-body clusters interacting through the van der Waals interaction. We also compare the system with the well-studied He, Ne, and Ar clusters. We calculate correlation properties and observe the generalised Tjon line for large clusters. We test the validity of the shape-independent potential in the calculation of the ground state energy of such diffuse clusters. These are the first such calculations reported for Rb clusters. (C) 2012 American Institute of Physics. [http://dx.doi.org/10.1063/1.4730972]


Metastasis is the complex process of tumor cell spread which is responsible for the majority of cancer-related deaths. Metastasis necessitates complex phenotypic changes, many of which are mediated by changes in the activities of cell surface molecules. One of these is cell surface β1,4-galactosyltransferase (GalTase), which is elevated on more highly metastatic cells. In this study, both molecular and biochemical methods were used to perturb and manipulate cell surface GalTase levels on K1735 murine melanoma cell lines in order to examine its function in metastasis. As expected, highly metastatic K1735 variants have higher cell surface GalTase than poorly metastatic variants. Stably transfected K1735 cell lines that overexpress surface GalTase were created. These cell lines were assayed for metastatic ability using an invasion chamber with Matrigel-coated filter inserts. Cells with increased surface GalTase were uniformly more invasive than neo-transfected controls. Across multiple cell lines, there was a direct correlation (r = 0.918) between surface GalTase activity and invasiveness. Homologous recombination was used to create K1735 cells with decreased levels of surface GalTase. These cells were uniformly less invasive than controls. Cell surface GalTase was also inhibited using two different biochemical strategies; in both cases, inhibition of surface GalTase led to a decrease in the in vivo metastatic ability of K1735 cells. This is the first direct experimental evidence addressing GalTase function in metastasis. These data provide several lines of independent evidence showing that cell surface GalTase facilitates invasion and metastasis.


Ocean biogeochemical and ecosystem processes are linked by net primary production (NPP) in the ocean's surface layer, where inorganic carbon is fixed by photosynthetic processes. Determinations of NPP are necessarily a function of phytoplankton biomass and its physiological status, but the estimation of these two terms from space has remained an elusive target. Here we present new satellite ocean color observations of phytoplankton carbon (C) and chlorophyll (Chl) biomass and show that derived Chl:C ratios closely follow anticipated physiological dependencies on light, nutrients, and temperature. With this new information, global estimates of phytoplankton growth rates (mu) and carbon-based NPP are made for the first time. Compared to an earlier chlorophyll-based approach, our carbon-based values are considerably higher in tropical oceans, show greater seasonality at middle and high latitudes, and illustrate important differences in the formation and demise of regional algal blooms. This fusion of emerging concepts from the phycological and remote sensing disciplines has the potential to fundamentally change how we model and observe carbon cycling in the global oceans.
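In spirit, the carbon-based estimate pairs satellite phytoplankton carbon with a growth rate inferred from the Chl:C ratio. The toy functions below illustrate only that coupling; the constants (mu_max, the reference Chl:C maximum) and the linear scaling are invented placeholders, not the paper's fitted physiological relationships.

```python
def growth_rate(chl, carbon, mu_max=2.0, chl_c_max=0.03):
    """Growth rate mu (1/day) scaled by how close the observed Chl:C
    ratio is to an assumed nutrient- and light-replete maximum;
    mu_max and chl_c_max are invented placeholders."""
    ratio = chl / carbon
    return mu_max * min(ratio / chl_c_max, 1.0)

def npp(chl, carbon):
    """NPP (mg C m^-3 d^-1) = carbon biomass (mg C m^-3) * mu (1/d)."""
    return carbon * growth_rate(chl, carbon)

# Example: 25 mg C m^-3 with Chl:C = 0.015 (half the assumed maximum)
# gives mu = 1.0 per day, so NPP = 25 mg C m^-3 d^-1.
```

The key design point is that biomass (C) and physiology (via Chl:C) enter as separate, independently observed terms, rather than being folded into a single chlorophyll product.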


Complex diseases such as cancer result from multiple genetic changes and environmental exposures. Owing to the rapid development of genotyping and sequencing technologies, we are now able to assess the causal effects of many genetic and environmental factors more accurately. Genome-wide association studies have localized many causal genetic variants predisposing to certain diseases. However, these studies explain only a small portion of the heritability of diseases. More advanced statistical models are urgently needed to identify and characterize additional genetic and environmental factors and their interactions, which will enable us to better understand the causes of complex diseases. In the past decade, thanks to increasing computational capability and novel statistical developments, Bayesian methods have been widely applied in genetics/genomics research and have demonstrated superiority over standard approaches in certain areas. Gene-environment and gene-gene interaction studies are among the areas where Bayesian methods can fully exert their strengths. This dissertation focuses on developing new Bayesian statistical methods for data analysis with complex gene-environment and gene-gene interactions, as well as extending some existing methods for gene-environment interactions to related areas. It includes three sections: (1) deriving a Bayesian variable selection framework for hierarchical gene-environment and gene-gene interactions; (2) developing Bayesian Natural and Orthogonal Interaction (NOIA) models for gene-environment interactions; and (3) extending two Bayesian statistical methods developed for gene-environment interaction studies to related problems, such as adaptively borrowing historical data.

We propose a Bayesian hierarchical mixture model framework that allows us to investigate genetic and environmental effects, gene-gene interactions (epistasis) and gene-environment interactions in the same model. It is well known that, in many practical situations, there exists a natural hierarchical structure between the main effects and interactions in a linear model. Here we propose a model that incorporates this hierarchical structure into the Bayesian mixture model, such that irrelevant interaction effects can be removed more efficiently, resulting in more robust, parsimonious and powerful models. We evaluate both the 'strong hierarchical' and 'weak hierarchical' models, which specify that both or at least one of the main effects of interacting factors, respectively, must be present for the interaction to be included in the model. Extensive simulation results show that the proposed strong and weak hierarchical mixture models control the proportion of false positive discoveries and yield a powerful approach to identifying the predisposing main effects and interactions in studies with complex gene-environment and gene-gene interactions. We also compare these two models with an 'independent' model that does not impose the hierarchical constraint, and observe the hierarchical models' superior performance in most of the situations considered. The proposed models are applied to real data analyses of gene-environment interactions in lung cancer and cutaneous melanoma case-control studies. Bayesian statistical models have the advantage of being able to incorporate useful prior information in the modeling process. Moreover, the Bayesian mixture model outperforms the multivariate logistic model in terms of parameter estimation and variable selection in most cases.

Our proposed models impose hierarchical constraints that further improve the Bayesian mixture model by reducing the proportion of false positive findings among the identified interactions while successfully identifying the reported associations. This is practically appealing for investigating causal factors among a moderate number of candidate genetic and environmental factors together with a relatively large number of interactions. The Natural and Orthogonal Interaction (NOIA) models of genetic effects were previously developed to provide an analysis framework in which the estimated effects for a quantitative trait are statistically orthogonal regardless of whether Hardy-Weinberg Equilibrium (HWE) holds within loci. Ma et al. (2012) recently developed a NOIA model for gene-environment interaction studies and showed its advantages for detecting true main effects and interactions compared with the usual functional model. In this project, we propose a novel Bayesian statistical model that combines the Bayesian hierarchical mixture model with the NOIA statistical model and the usual functional model. The proposed Bayesian NOIA model demonstrates more power at detecting non-null effects, with higher marginal posterior probabilities. We also review two Bayesian statistical models (a Bayesian empirical shrinkage-type estimator and Bayesian model averaging) that were developed for gene-environment interaction studies. Inspired by these models, we develop two novel statistical methods that can handle related problems such as borrowing data from historical studies. The proposed methods are analogous to the methods for gene-environment interactions in their success at balancing statistical efficiency and bias in a unified model.

Through extensive simulation studies, we compare the operating characteristics of the proposed models with existing models, including the hierarchical meta-analysis model. The results show that the proposed approaches adaptively borrow the historical data in a data-driven way. These novel models may have a broad range of statistical applications in both genetic/genomic and clinical studies.
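The strong/weak/independent hierarchy can be made concrete as a prior on the interaction's inclusion indicator. This is a minimal sketch assuming a flat inclusion probability p_int, which is an illustrative placeholder rather than the dissertation's actual prior specification.

```python
def interaction_prior(gamma_i, gamma_j, mode="strong", p_int=0.2):
    """Prior inclusion probability of the i-j interaction given the
    main-effect inclusion indicators gamma_i, gamma_j in {0, 1}."""
    if mode == "strong":      # both main effects must be in the model
        allowed = gamma_i == 1 and gamma_j == 1
    elif mode == "weak":      # at least one main effect must be in
        allowed = gamma_i == 1 or gamma_j == 1
    else:                     # "independent": no hierarchical constraint
        allowed = True
    return p_int if allowed else 0.0
```

Setting the prior to zero when the heredity condition fails is what lets the sampler discard irrelevant interactions whose parent main effects are absent.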


This paper presents ASYTRAIN, a new tool to teach and learn antennas, based on the use of a modular building kit and a low-cost portable antenna measurement system that lets students design and build different types of antennas and observe their characteristics while learning the insights of the subject. The tool comes with a methodology guide for try-and-test project development and makes students active antenna engineers instead of passive learners. This experiential learning method raises their motivation during antenna courses.


This thesis focuses on the analysis of two complementary aspects of cybercrime (that is, crime perpetrated over the network for financial gain): the infected machines used to obtain economic benefit from crime through different actions (e.g., click fraud, DDoS, spam) and the server infrastructure used to manage those machines (e.g., C&C servers, exploit servers, monetization servers, redirectors). The first part investigates the threat exposure of victim computers. For this analysis we used the metadata contained in WINE-BR, a Symantec dataset. This dataset contains installation metadata for executable files (e.g., file hash, publisher, installation date, filename, file version) from 8.4 million Windows users. We associated this metadata with the vulnerabilities in the National Vulnerability Database (NVD) and the Open Sourced Vulnerability Database (OSVDB) in order to track vulnerability decay over time and observe how quickly users patch their systems and, therefore, their exposure to potential attacks. We identified three factors that can influence the patching activity of victim computers: shared code, user type, and exploits. We present two new attacks against shared code and an analysis of how user expertise and exploit availability influence patching activity.

For the 80 vulnerabilities in our database that affect code shared between two applications, the time between patch releases for the different applications is up to 118 days (with a median of 11 days). The second part proposes new active probing techniques to detect and analyze malicious server infrastructures. We leverage active probing to detect malicious servers on the Internet, starting with the analysis and detection of exploit server operations. We identify as one operation the servers that are controlled by the same people and possibly take part in the same infection campaign. We analyzed a total of 500 exploit servers over a period of one year, finding that 2/3 of the operations had a single server while 1/3 had multiple servers. We extended the exploit server detection technique to other server types (e.g., C&C servers, monetization servers, redirectors) and achieved Internet-scale probing for the different categories of malicious servers. These new techniques have been incorporated into a new tool called CyberProbe. To detect these servers we developed a novel technique called Adversarial Fingerprint Generation, a methodology for generating a unique request-response model that identifies a server family (that is, the type and the operation the server belongs to). Starting from a malware file and a live server of a given family, CyberProbe can generate a valid fingerprint to detect all live servers of that family. We performed 11 Internet-wide scans, detecting 151 malicious servers, 75% of which were unknown to public databases of malicious servers.

Another issue that arises while detecting malicious servers is that some of them may be hidden behind a silent reverse proxy. To measure the prevalence of this network configuration and improve the capabilities of CyberProbe, we developed RevProbe, a new tool that detects reverse proxies by leveraging leakages in the configuration of web reverse proxies. RevProbe finds that 16% of the active malicious IP addresses analyzed correspond to reverse proxies, that 92% of them are silent (compared with 55% for benign reverse proxies), and that they are mainly used for load balancing across multiple servers.

ABSTRACT: In this dissertation we investigate two fundamental aspects of cybercrime: the infection of machines used to monetize the crime and the malicious server infrastructures that are used to manage the infected machines. In the first part of this dissertation, we analyze how fast software vendors apply patches to secure client applications, identifying shared code as an important factor in patch deployment. Shared code is code present in multiple programs. When a vulnerability affects shared code, the usual linear vulnerability life cycle is no longer adequate to describe how patch deployment takes place. In this work we show the consequences of shared code vulnerabilities and we demonstrate two novel attacks that can be used to exploit this condition. In the second part of this dissertation we analyze malicious server infrastructures. Our contributions are: a technique to cluster exploit server operations; a tool named CyberProbe to perform large-scale detection of different malicious server categories; and RevProbe, a tool that detects silent reverse proxies. We start by identifying exploit server operations, that is, exploit servers managed by the same people.
We investigate a total of 500 exploit servers over a period of more than 13 months. We collected malware from these servers along with all the metadata related to the communication with them. From this metadata we extracted different features to group together servers managed by the same entity (i.e., an exploit server operation), and discovered that 2/3 of the operations have a single server while 1/3 have multiple servers. Next, we present CyberProbe, a tool that detects different malicious server types through a novel technique called adversarial fingerprint generation (AFG). The idea behind CyberProbe's AFG is to run a piece of malware and observe its network communication towards malicious servers. It then replays this communication to the malicious server and outputs a fingerprint (i.e., a port selection function, a probe generation function and a signature generation function). Once the fingerprint is generated, CyberProbe scans the Internet with it and finds all the servers of a given family. We performed a total of 11 Internet-wide scans, finding 151 new servers starting from 15 seed servers, a tenfold amplification factor. Moreover, comparing CyberProbe with existing blacklists on the Internet, we found that only 40% of the servers detected by CyberProbe were listed. To enhance the capabilities of CyberProbe we developed RevProbe, a reverse proxy detection tool that can be integrated with CyberProbe to allow precise detection of silent reverse proxies used to hide malicious servers. RevProbe leverages leakage-based detection techniques to detect whether a malicious server is hidden behind a silent reverse proxy, and to reveal the infrastructure of servers behind it. At the core of RevProbe is the analysis of differences in the traffic obtained by interacting with a remote server.
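The fingerprint triple (port selection, probe generation, signature matching) can be sketched as follows. The probe bytes, the `X-Family-Token` signature and the `send` helper are invented placeholders for illustration, not an actual CyberProbe fingerprint or API.

```python
from dataclasses import dataclass
from typing import Callable, List, Tuple

@dataclass
class Fingerprint:
    ports: Callable[[], List[int]]    # port selection function
    probe: Callable[[], bytes]        # probe (request) generation function
    matches: Callable[[bytes], bool]  # signature: does the reply identify the family?

# Invented placeholder fingerprint -- not a real malware family's protocol.
fp = Fingerprint(
    ports=lambda: [80, 8080],
    probe=lambda: b"GET /gate.php?id=probe HTTP/1.1\r\nHost: x\r\n\r\n",
    matches=lambda reply: b"X-Family-Token" in reply,
)

def scan(hosts, fp, send):
    """Return (host, port) pairs whose reply to the replayed probe
    matches the signature; send(host, port, data) is an assumed
    network helper, stubbed out here."""
    hits: List[Tuple[str, int]] = []
    for host in hosts:
        for port in fp.ports():
            if fp.matches(send(host, port, fp.probe())):
                hits.append((host, port))
                break  # one matching port per host is enough
    return hits
```

A benign server answers the replayed probe differently from a family member, so only hosts whose replies satisfy the signature are reported as detections.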