945 results for short-time dynamics
Abstract:
The genetic information and data now emerging from human genome research demand computational tools capable of processing the large amount of available information. Most genetic data come from equipment that analyses hundreds or thousands of polymorphisms or genetic variants simultaneously and from new, higher-throughput laboratory techniques, which together make far more information available in a short time. This creates the need for new software capable of handling the growing volume of genetic data. In population genetics, although software tools exist that process data and facilitate analysis, they have limitations: some require users to know programming languages in order to enter the data, some do not perform all the required estimates, and others are limited in the number of records they can load or handle. In some cases there is redundancy, since two or more tools must be used to process a single genetic data set. The aim of this work is to develop a software tool based on a common desktop application, in this case Microsoft Excel®, that resolves all the problems and limitations described above. The set of subroutines that make up Lustro overcomes them, presenting results in a simple, familiar, easy-to-operate environment; this shortens the user's adaptation period, requires no prior training, and delivers the processed genetic information of interest in a short time.
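The abstract does not describe Lustro's internals; as a rough illustration of the kind of estimate such a tool automates, here is a minimal sketch (hypothetical data and function names, not Lustro's code) that computes allele frequencies and Hardy-Weinberg expected heterozygosity for one locus.

```python
from collections import Counter

def allele_frequencies(genotypes):
    """Estimate allele frequencies from a list of diploid genotypes,
    where each genotype is a 2-tuple of allele labels, e.g. ('A', 'a')."""
    counts = Counter(a for g in genotypes for a in g)
    total = sum(counts.values())
    return {allele: n / total for allele, n in counts.items()}

def expected_heterozygosity(freqs):
    """Hardy-Weinberg expected heterozygosity: 1 - sum(p_i^2)."""
    return 1.0 - sum(p * p for p in freqs.values())

# Hypothetical sample of 5 individuals at one biallelic locus
sample = [('A', 'A'), ('A', 'a'), ('a', 'a'), ('A', 'a'), ('A', 'A')]
p = allele_frequencies(sample)        # {'A': 0.6, 'a': 0.4}
print(p, expected_heterozygosity(p))  # He = 1 - (0.36 + 0.16) = 0.48
```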
Abstract:
Companies are created by people for people, and it is people who create, launch, modify and manage each concept of the company in order to change course and endure. The people who make up the organization are fundamental, bearing in mind that many internal and external factors influence the organization in different ways. Flexibility and responsiveness are key elements, and they matter most when strategy is being designed. Clearly, in every sector of the world economy, companies join forces to create new response models to face whatever circumstances may arise. A simple but very complete idea, TO BE THE LARGEST BAKERY CHAIN, was refined over time to near perfection and became durable in a sector clearly saturated by a lack of innovation and competitive advantages. PAN PA YA knew how to exploit those advantages, breaking with the traditional model and seeking new sales strategies and new products, with the quality of its products as its shield and innovation as its main weapon. The success of PAN PA YA was due largely to creating the need for a quality product: innovation, staff training, the development of new and exquisite products, and simple ingredients for a recipe anyone could have, which allowed it to consolidate a successful company in a very short time.
Abstract:
This monograph arises from asking why the United States developed, in a very short time, a large number of relationships, many of them highly significant, with a state that had generally lacked importance in its foreign policy: India. It argues that the reason behind these relationships is the new distribution of power in Asia, the importance of China, and the emerging relations among China, Russia and India.
Abstract:
In the Foucauldian theoretical framework, subjectivity is understood as an element that accounts for the experience of subjects, responding to the particularities of each historical moment. Three axes of analysis are identified for the study of subjectivity: knowledge, power and truth. Each uses its own methodology, namely archaeology, genealogy and ontology, respectively. This text explores the concept of subjectivity in the work of M. Foucault in order to find a methodology for the study and investigation of subjectivity as a concept of vital importance for approaching social and cultural phenomena. It first describes the dynamics of knowledge and the games of veridiction; it then describes the dynamics of power, that is, the forces that produce a particular ordering of experience. Finally, it turns to the concept of truth in relation to the constitution of the subject, to the relationship between the games of power and knowledge as ordering systems, and to truth as their effect; in this way the focus of experience is treated as the unit and object of analysis.
Abstract:
We examine the long-run relationship between the parallel and the official exchange rate in Colombia across two regimes: a crawling-peg period and a more flexible crawling-band one. The short-run adjustment process of the parallel rate is examined in both a linear and a nonlinear context. We find that the change from the crawling-peg to the crawling-band regime did not affect the long-run relationship between the official and parallel exchange rates, but it altered the short-run dynamics. Nonlinear adjustment seems appropriate for the first period, mainly because strict foreign-exchange controls distort the transition back to equilibrium once a disequilibrium occurs.
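A minimal sketch of the kind of threshold (nonlinear) error-correction adjustment alluded to above: inside a no-adjustment band the parallel rate barely corrects, outside it the correction is fast. The band width and adjustment speeds are hypothetical, not the paper's estimates.

```python
import numpy as np

def threshold_ecm_step(parallel, official, band=0.02, rho_out=0.3, rho_in=0.02):
    """One step of a threshold error-correction model: the parallel rate
    corrects toward the official rate quickly only when the log premium
    leaves a no-adjustment band, mimicking strict foreign-exchange controls."""
    premium = np.log(parallel) - np.log(official)      # disequilibrium term
    rho = rho_out if abs(premium) > band else rho_in   # regime-dependent speed
    return parallel * np.exp(-rho * premium)           # partial correction

rate = 1200.0
for _ in range(20):
    rate = threshold_ecm_step(rate, official=1000.0)
print(round(rate, 1))  # drifts toward the official rate, then stalls inside the band
```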
Abstract:
A simple and promising oxide-assisted, catalyst-free method is used to prepare silicon nitride nanowires in high yield in a short time. After a brief review of the state of the art, we show the crucial role played by the oxygen partial pressure: when it is slightly below the threshold of passive oxidation, a high yield is obtained while the formation of any silica layer covering the nanowires is inhibited, and the synthesis temperature can then be used to control the nanowire dimensions.
Abstract:
Knowledge has always existed, even in a latent state, stored somewhere and merely awaiting a means (an opportunity) to manifest itself. Knowledge is doubly a phenomenon of consciousness: it proceeds from consciousness at a given moment of its life and history, and it ends only in consciousness, perfecting and enriching it. Knowledge is thus in constant change. Knowledge Management began to be discussed relatively recently, and at the time it was strongly associated with Information Technology, as a means of collecting, processing and storing ever larger amounts of information. Information Technology has played an extremely important role in organizations for some years now; it was initially adopted to automate the operational processes that support day-to-day activities, and in recent times it has evolved rapidly within organizations. All knowledge, even the least relevant to a particular business area, is fundamental to supporting decision-making. To achieve better performance and surpass their initial goals, organizations tend to equip themselves with more and better Information Systems and to use the various methodologies and technologies available today. Consequently, in recent years many organizations have shown a crucial need to integrate all of their information, which is dispersed across their departments. For top managers (and other employees as well) to have pertinent, accurate and reliable information about the organization's business available in useful time, they need access to good Information Technology systems, so that they can act more effectively and efficiently in decision-making, extract the maximum possible value from the information, and thus achieve better levels of organizational success. Business Intelligence systems and their associated Information Technologies likewise use an organization's existing data to provide relevant information for decision-making. But to reach such levels, organizations need human resources, for how can they be competitive without qualified workers? Hence the need for organizations to recruit what are today called "Knowledge Workers": individuals qualified to interpret information within a specific domain. They detect problems and identify alternatives and, with their knowledge and discernment, work to solve those problems, considerably helping the organizations they represent. Using Knowledge Engineering methodologies and technologies such as modelling, they create and manage a record of knowledge, including tacit knowledge, about the organization's various business areas, which can be made explicit in abstract models that other workers with equivalent levels of competence can easily understand and interpret.
Abstract:
This work discusses the liberalization of air transport in the European market, the growth of air traffic, and the importance of airport slot regulation for fair competition. Given current growth and the expectation that air traffic will grow even further, airports and airlines face a problem of great magnitude in the management of airport slots, since airport capacity is limited and demand for these facilities has increased dramatically. At several European airports that capacity is expected to be reached soon, so congestion will no longer occur only for a short period each day but will increase in both frequency and duration. This congestion can affect competition among airlines and among airports, with negative impacts on both the environment and safety. The study considers the capacity of Lisbon Airport, the demand for airport slots, national and international legislation, other means of allocating airport slots, TAP's position on the existing regulation, and the parallel allocation mechanisms.
Abstract:
The Covered Catchment Experiment at Gårdsjön is a large-scale forest-ecosystem manipulation in which acid precipitation was intercepted by a 7000 m² plastic roof and replaced by 'clean precipitation' sprinkled below the roof for ten years, between 1991 and 2001. The treatment produced a strong positive response in runoff quality: runoff sulphate, inorganic aluminium and base cations decreased, while runoff ANC increased strongly and pH increased moderately. The runoff continued to improve over the whole duration of the experiment; after ten years, however, the achieved quality was still considerably worse than the estimated pre-industrial runoff at the site. Stable isotopes of sulphur were analysed to study soil sulphur cycling. In the initial years of the experiment, desorption of SO4 from the mineral soil appeared to control the runoff SO4 concentration, but as the experiment proceeded there was growing evidence that net mineralisation of soil organic sulphur in the humus layer was an additional source of SO4 in runoff. This might pose a challenge to current acidification models. The experiment convincingly demonstrated, on a catchment scale, that a reduction in acid deposition causes an immediate improvement in surface-water quality even at heavily acidified sites. The improvement in runoff appeared to be largely a result of cation-exchange processes in the soil driven by decreasing soil-solution concentrations, while any potential change in soil base saturation seemed less important for runoff chemistry over the short time period of one decade. These findings should be considered when interpreting and extrapolating regional trends in surface-water chemistry to the terrestrial parts of ecosystems.
Abstract:
One of the major uncertainties in the ability to predict future climate change, and hence its impacts, is the lack of knowledge of the earth's climate sensitivity. Here, data from the 1985-96 Earth Radiation Budget Experiment (ERBE) are combined with surface temperature change information and estimates of radiative forcing to diagnose the climate sensitivity. Importantly, the estimate is completely independent of climate model results. A climate feedback parameter of 2.3 ± 1.4 W m⁻² K⁻¹ is found. This corresponds to a 1.0–4.1 K range for the equilibrium warming due to a doubling of carbon dioxide (assuming Gaussian errors in observable parameters, which is approximately equivalent to a uniform "prior" in the feedback parameter). The uncertainty range is due to a combination of the short time period of the analysis and uncertainties in the surface temperature and radiative forcing time series, mostly the former. Radiative forcings may not all be fully accounted for; however, an argument is presented that the estimate of climate sensitivity is still likely to be representative of longer-term climate change. The methodology can be used to 1) retrieve the shortwave and longwave components of climate feedback and 2) suggest clear-sky and cloud feedback terms. There is preliminary evidence of a neutral or even negative longwave feedback in the observations, suggesting that current climate models may not be representing some processes correctly if they give a net positive longwave feedback.
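The quoted range follows from the standard energy-balance identity relating feedback and equilibrium warming; a worked version, assuming the conventional CO₂-doubling forcing F₂ₓ ≈ 3.7 W m⁻² (a value the abstract does not state):

```latex
\lambda = \frac{F - N}{\Delta T_s},
\qquad
\Delta T_{2\times} = \frac{F_{2\times}}{\lambda},
\qquad
\frac{3.7}{2.3 + 1.4} \approx 1.0~\mathrm{K}
\;\le\; \Delta T_{2\times} \;\le\;
\frac{3.7}{2.3 - 1.4} \approx 4.1~\mathrm{K},
```

where F is the radiative forcing, N the net top-of-atmosphere flux imbalance observed by ERBE, and ΔTs the surface temperature change; the bounds reproduce the 1.0–4.1 K range quoted above.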
Abstract:
We present results from fast-response wind measurements within and above a busy intersection between two street canyons (Marylebone Road and Gloucester Place) in Westminster, London taken as part of the DAPPLE (Dispersion of Air Pollution and Penetration into the Local Environment; www.dapple.org.uk) 2007 field campaign. The data reported here were collected using ultrasonic anemometers on the roof-top of a building adjacent to the intersection and at two heights on a pair of lamp-posts on opposite sides of the intersection. Site characteristics, data analysis and the variation of intersection flow with the above-roof wind direction (θref) are discussed. Evidence of both flow channelling and recirculation was identified within the canyon, only a few metres from the intersection for along-street and across-street roof-top winds respectively. Results also indicate that for oblique rooftop flows, the intersection flow is a complex combination of bifurcated channelled flows, recirculation and corner vortices. Asymmetries in local building geometry around the intersection and small changes in the background wind direction (changes in 15-min mean θref of 5–10 degrees) were also observed to have profound influences on the behaviour of intersection flow patterns. Consequently, short time-scale variability in the background flow direction can lead to highly scattered in-street mean flow angles masking the true multi-modal features of the flow and thus further complicating modelling challenges.
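One practical reason scattered in-street mean flow angles can mask multi-modal structure is that arithmetic averaging of directions is ill-defined near the 0°/360° wrap. Below is a generic vector (circular) mean of the kind implied by a 15-min mean θref, offered as a sketch rather than the DAPPLE processing chain.

```python
import numpy as np

def circular_mean_deg(directions_deg):
    """Vector-average wind directions in degrees: average the unit vectors,
    then take the resultant angle, avoiding the 0/360 wrap-around artefact."""
    rad = np.deg2rad(directions_deg)
    return np.rad2deg(np.arctan2(np.sin(rad).mean(), np.cos(rad).mean())) % 360

print(circular_mean_deg([350, 10]))  # 0.0, where a naive mean gives a misleading 180.0
```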
Abstract:
New conceptual ideas on network architectures have been proposed in the recent past. Conventional store-and-forward routers are replaced by active intermediate systems that can perform computations on transient packets, which proves very helpful for developing and deploying new protocols in a short time. This paper introduces a new routing algorithm based on a congestion metric and inspired by the behavior of ants in nature. The Active Networks paradigm, combined with a cooperative learning environment, produces a robust, decentralized algorithm capable of adapting quickly to changing conditions.
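A minimal sketch of ant-inspired, congestion-aware routing of the kind described above: forward ants sample next hops in proportion to pheromone, and backward ants reinforce fast paths. The update rule and constants are illustrative, not the paper's exact algorithm.

```python
import random

def reinforce(pheromone, next_hop, delay, evaporation=0.1):
    """Ant-style update for one destination's routing row: reward the hop a
    backward ant used, inversely to the congestion delay it experienced,
    then evaporate and renormalise so the row stays a probability vector."""
    pheromone[next_hop] += 1.0 / (1.0 + delay)   # congestion metric as reward
    for hop in pheromone:
        pheromone[hop] *= (1.0 - evaporation)    # evaporation forgets stale paths
    total = sum(pheromone.values())
    for hop in pheromone:
        pheromone[hop] /= total

def choose_next_hop(pheromone):
    """Forward ants sample the next hop in proportion to pheromone."""
    hops, weights = zip(*pheromone.items())
    return random.choices(hops, weights=weights)[0]

row = {'B': 0.5, 'C': 0.5}      # neighbours for some destination
reinforce(row, 'B', delay=0.2)  # B was fast: its share grows to ~0.73
print(row, choose_next_hop(row))
```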
Abstract:
Background: We report an analysis of a protein network of functionally linked proteins, identified from a phylogenetic statistical analysis of complete eukaryotic genomes. Phylogenetic methods identify pairs of proteins that co-evolve on a phylogenetic tree, and have been shown to have a high probability of correctly identifying known functional links. Results: The eukaryotic correlated evolution network we derive displays the familiar power law scaling of connectivity. We introduce the use of explicit phylogenetic methods to reconstruct the ancestral presence or absence of proteins at the interior nodes of a phylogeny of eukaryote species. We find that the connectivity distribution of proteins at the point they arise on the tree and join the network follows a power law, as does the connectivity distribution of proteins at the time they are lost from the network. Proteins resident in the network acquire connections over time, but we find no evidence that 'preferential attachment' - the phenomenon of newly acquired connections in the network being more likely to be made to proteins with large numbers of connections - influences the network structure. We derive a 'variable rate of attachment' model in which proteins vary in their propensity to form network interactions independently of how many connections they have or of the total number of connections in the network, and show how this model can produce apparent power-law scaling without preferential attachment. Conclusion: A few simple rules can explain the topological structure and evolutionary changes to protein-interaction networks: most change is concentrated in satellite proteins of low connectivity and small phenotypic effect, and proteins differ in their propensity to form attachments. Given these rules of assembly, power law scaled networks naturally emerge from simple principles of selection, yielding protein interaction networks that retain a high degree of robustness on short time scales and evolvability on longer evolutionary time scales.
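A minimal simulation of the 'variable rate of attachment' idea: each protein draws its own attachment propensity, and partners are chosen in proportion to propensity rather than to current degree, which can still yield a heavy-tailed degree distribution. The parameters and the propensity distribution are illustrative assumptions, not the paper's fitted model.

```python
import random

def grow_network(n_nodes=2000, links_per_node=2):
    """Variable-rate-of-attachment growth: every new protein gets its own
    attachment propensity (heavy-tailed here); existing partners are picked
    in proportion to their propensity, not their current degree."""
    propensity, degree = [], []
    for _ in range(n_nodes):
        propensity.append(random.paretovariate(2.0))   # per-protein propensity
        degree.append(0)
        if len(propensity) > 1:
            others = list(range(len(propensity) - 1))
            weights = [propensity[i] for i in others]
            # sample partners by propensity (with replacement; fine for a sketch)
            for j in random.choices(others, weights=weights, k=links_per_node):
                degree[j] += 1
                degree[-1] += 1
    return degree

deg = grow_network()
print(max(deg), sorted(deg)[len(deg) // 2])  # heavy tail: max far above the median
```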
Abstract:
Many nations are experiencing rapid rises in the life expectancy of their citizens. The implications of this major demographic shift are considerable offering opportunities as well as challenges to reconsider how people should spend their later years. A key task is enhancing the quality of life of older people through enabling them to continue to live independently even though illness, accident or frailty may have severely reduced their physical and sensory abilities and, possibly, mental health. Yet the needs of older people and disabled people have been largely ignored in the design of everyday consumer products, the home, transport systems and the built environment in general. Whilst the need for designers, engineers and technologists to provide products, environments and systems which are inclusive of all members of society is widely accepted, there is little understanding of how this can be achieved. In 1998 the UK Engineering and Physical Sciences Research Council established its EQUAL Initiative. This has encouraged design, engineering and technology researchers in universities to join with their colleagues from the social, medical and health sciences to investigate a wide range of issues experienced by older and disabled people and to propose solutions. Their research, which directly involves older and disabled people and, for example, social housing providers, social services departments, charities, engineering and architectural consultants, and transport firms, has been extremely successful. In a very short time it has influenced government policy on housing, long-term care, and building standards, and findings have been taken up by architects, designers, health-care professionals and bodies which represent older and disabled people.
Abstract:
External interference can severely degrade the performance of over-the-horizon radar (OTHR), so suppressing external interference in a strong-clutter environment is a prerequisite for target detection. Traditional solutions usually begin with clutter suppression in either the time or the frequency domain, followed by interference detection and suppression. Building on this approach, this paper proposes a method characterized by joint clutter suppression and interference detection: eigenvalues are analysed in a short-time moving window centred at successive time positions; clutter is suppressed by discarding the three largest eigenvalues at every position, while detection is achieved by analysing the remaining eigenvalues at each position. Restoration is then achieved by forward-backward linear prediction using interference-free data surrounding the interference position. In the numerical computation, the eigenvalue decomposition (EVD) is replaced by the singular value decomposition (SVD), based on the equivalence of the two procedures. Data processing and experimental results demonstrate the method's effectiveness, lowering the noise floor by about 10-20 dB.
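A schematic of the windowed-SVD clutter-suppression step described above: embed one short-time window in a Hankel matrix, zero the three largest singular values as the clutter subspace, and keep the residual singular values for interference detection. The Hankel embedding, window length and test data are assumptions for illustration, not the paper's exact processing chain.

```python
import numpy as np

def window_svd_suppress(x, n_clutter=3):
    """Within one short-time window x: form a Hankel (time-embedding) matrix,
    zero the n_clutter largest singular values (clutter subspace), and return
    the de-cluttered signal plus the residual singular values, which can be
    thresholded to flag interference at this window position."""
    L = len(x) // 2                                   # embedding dimension
    H = np.array([x[i:i + L] for i in range(len(x) - L + 1)])
    U, s, Vt = np.linalg.svd(H, full_matrices=False)
    s_clean = s.copy()
    s_clean[:n_clutter] = 0.0                         # discard clutter components
    H_clean = (U * s_clean) @ Vt
    # Average anti-diagonals to map the rank-reduced matrix back to a signal
    y = np.array([H_clean[::-1, :].diagonal(k).mean()
                  for k in range(-H_clean.shape[0] + 1, H_clean.shape[1])])
    return y, s[n_clutter:]

# Hypothetical window: strong low-rank "clutter" tone plus weak target plus noise
t = np.arange(256)
x = 10 * np.cos(0.05 * t) + 0.5 * np.cos(0.9 * t) + 0.1 * np.random.randn(256)
y, residual = window_svd_suppress(x)
print(residual[:5])  # a jump here would indicate interference in this window
```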