995 results for 280301 Programming Techniques
Abstract:
A decision theory framework can be a powerful technique to derive optimal management decisions for endangered species. We built a spatially realistic stochastic metapopulation model for the Mount Lofty Ranges Southern Emu-wren (Stipiturus malachurus intermedius), a critically endangered Australian bird. Using discrete-time Markov chains to describe the dynamics of a metapopulation and stochastic dynamic programming (SDP) to find optimal solutions, we evaluated the following management decisions: enlarging existing patches, linking patches via corridors, and creating a new patch. This is the first application of SDP to optimal landscape reconstruction and one of the few times that landscape reconstruction dynamics have been integrated with population dynamics. SDP is a powerful tool that has advantages over standard Monte Carlo simulation methods because it can give the exact optimal strategy for every landscape configuration (combination of patch areas and presence of corridors) and pattern of metapopulation occupancy, as well as a trajectory of strategies. It is useful when a sequence of management actions can be performed over a given time horizon, as is the case for many endangered species recovery programs, where only fixed amounts of resources are available in each time step. However, it is generally limited by computational constraints to rather small networks of patches. The model shows that optimal metapopulation management decisions depend greatly on the current state of the metapopulation, and there is no strategy that is universally the best. The extinction probability over 30 yr for the optimal state-dependent management actions is 50-80% better than no management, whereas the best fixed state-independent sets of strategies are only 30% better than no management. This highlights the advantages of using a decision theory tool to investigate conservation strategies for metapopulations. It is clear from these results that the sequence of management actions is critical, and this can only be effectively derived from stochastic dynamic programming. The model illustrates the underlying difficulty in determining simple rules of thumb for the sequence of management actions for a metapopulation. This use of a decision theory framework extends the capacity of population viability analysis (PVA) to manage threatened species.
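The backward-induction structure of SDP lends itself to a compact illustration. The Haskell sketch below is ours: a hypothetical four-state occupancy model with made-up transition probabilities stands in for the paper's metapopulation model, and it shows only the shape of the computation: sweep backwards from the time horizon, choosing in each state the action that maximises expected persistence.

    -- Minimal SDP by backward induction. States, actions, and
    -- probabilities are hypothetical, not the paper's parameterisation.
    import Data.List (maximumBy)
    import Data.Ord (comparing)

    data Action = Enlarge | Corridor | NewPatch | NoAction deriving (Show, Eq)

    type State = Int                       -- occupied patches, 0..3 (toy model)

    states :: [State]
    states = [0 .. 3]

    actions :: [Action]
    actions = [Enlarge, Corridor, NewPatch, NoAction]

    -- Hypothetical one-step transitions: (next state, probability).
    next :: Action -> State -> [(State, Double)]
    next _ 0 = [(0, 1.0)]                  -- extinction is absorbing
    next a s = [(min 3 (s + 1), p), (s - 1, q), (s, 1 - p - q)]
      where
        (p, q) = case a of                 -- illustrative gain/loss rates
          Enlarge  -> (0.30, 0.10)
          Corridor -> (0.25, 0.10)
          NewPatch -> (0.35, 0.15)
          NoAction -> (0.10, 0.25)

    -- Terminal reward: 1 if extant at the horizon, 0 if extinct.
    terminal :: State -> Double
    terminal s = if s > 0 then 1 else 0

    -- One backward sweep: optimal (action, expected persistence) per
    -- state, given the value of each state one step later.
    sweep :: [Double] -> [(Action, Double)]
    sweep vs = [ best s | s <- states ]
      where
        best s = maximumBy (comparing snd)
                   [ (a, sum [ p * (vs !! s') | (s', p) <- next a s ])
                   | a <- actions ]

    -- plan !! t: the optimal state-dependent strategy with t+1 steps to
    -- go, over a 30-step horizon.
    plan :: [[(Action, Double)]]
    plan = take 30 (go (map terminal states))
      where go vs = let layer = sweep vs in layer : go (map snd layer)

Because each sweep records the best action for every state, the result is exactly the kind of state-dependent strategy trajectory the abstract describes, rather than a single fixed policy.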
Abstract:
Chest clapping, vibration, and shaking were studied in 10 physiotherapists who applied these techniques on an anesthetized animal model. Hemodynamic variables (such as heart rate, blood pressure, pulmonary artery pressure, and right atrial pressure) were measured during the application of these techniques to verify claims of adverse events. In addition, expired tidal volume and peak expiratory flow rate were measured to ascertain the effects of these techniques. Physiotherapists in this study applied chest clapping at a rate of 6.2 +/- 0.9 Hz, vibration at 10.5 +/- 2.3 Hz, and shaking at 6.2 +/- 2.3 Hz. With the use of these rates, esophageal pressure swings of 8.8 +/- 5.0, 0.7 +/- 0.3, and 1.4 +/- 0.7 mmHg resulted from clapping, vibration, and shaking, respectively. Variability in the rates and forces generated by these techniques was related to the physiotherapists' characteristics: clinical experience accounted for 80% of the variance in shaking force (P = 0.003). Application of these techniques by physiotherapists was found to have no significant effects on hemodynamic and most ventilatory variables in this study. From this study, we conclude that chest clapping, vibration, and shaking 1) can be consistently performed by physiotherapists; 2) are significantly related to physiotherapists' characteristics, particularly clinical experience; and 3) caused no significant hemodynamic effects.
Abstract:
The left ventricular response to dobutamine may be quantified using tissue Doppler measurement of myocardial velocity or displacement, or 3-dimensional echocardiography to measure ventricular volume and ejection fraction. This study sought to explore the accuracy of these methods for predicting segmental and global responses to therapy. Standard dobutamine echocardiography and 3-dimensional echocardiography were performed in 92 consecutive patients with abnormal left ventricular function at rest. Recovery of function was defined by comparison with follow-up echocardiography at rest 5 months later. Segments that showed improved regional function at follow-up showed a higher increment in peak tissue Doppler velocity with dobutamine than nonviable segments (1.2 +/- 0.4 vs 0.3 +/- 0.2 cm/s, p = 0.001). Similarly, patients who showed a >5% improvement of ejection fraction at follow-up showed a greater displacement response to dobutamine (6.9 +/- 3.2 vs 2.1 +/- 2.3 mm, p = 0.001), as well as a greater ejection fraction response to dobutamine (9 +/- 3% vs 2 +/- 2%, p = 0.001). The optimal cutoff values for predicting subsequent recovery of function at rest were an increment of peak velocity >1 cm/s, >5 mm of displacement, and a >5% improvement of ejection fraction with low-dose dobutamine. (C) 2003 by Excerpta Medica, Inc.
Abstract:
The health benefits provided by probiotic bacteria have led to their increasing use in fermented and other dairy products. However, their viability in these products is low. Encapsulation has been investigated as a way to protect the bacteria in the product's environment and improve their survival. There are two common encapsulation techniques, extrusion and emulsion, for encapsulating probiotics for use in fermented and other dairy products. This review evaluates the merits and limitations of these two techniques, and also discusses the supporting materials and special treatments used in encapsulation processes. (C) 2003 Elsevier Science Ltd. All rights reserved.
Abstract:
The initial aim of this research was to follow work processes in the light of the theoretical framework of Ergology, thereby conceiving work as a dialectical relation between technique and human action. The objective was to map the work involved in granite processing at a large organization located in Espírito Santo; after some time in the field, the research problem took the following form: how is industrious competence constituted in granite processing at a large organization? The research is justified because, despite its economic relevance, the ornamental stone sector of Espírito Santo suffers from precarious management. For Organization Studies, its relevance is reinforced by bringing the ergological approach into the area and by introducing into the debate on competence the notion of industrious competence, not yet explored in this field of study. The research was conducted as an ergological cartography, articulating cartographic cues with the theoretical-conceptual framework of Ergology, using the following techniques: participant observation over 6 months, with an average of 3 field visits per week; 8 semi-structured, in-depth interviews of about 50 minutes each with operational workers; one interview with the production manager and another with a representative of the People Management area; conversations with the other workers, to enrich the field diary; and further conversations and observations at the end of the analysis, for confrontation and validation with the workers. The analysis procedures can be described as follows: a) floating readings intended to bring out central aspects related to the two dimensions of work, technique and human action; b) in-depth readings intended to bring out singularities and specificities concerning the dialectic between the two; c) in-depth readings intended to bring out aspects related to the ingredients of industrious competence.
Although no analytical categories or subcategories were delimited in advance, five analytical axes emerged from the analysis: 1) the procedures to be employed in granite processing, encompassing the stages of processing; the functions to be performed and the tasks to be carried out; the regulatory standards; the technical knowledge required to program and operate machines; and the production order prescribed by the sales department; 2) real work, distinguished from work as the mere application of procedures by its focus on human action in coping with real situations, full of events and variabilities, throughout the process, encompassing load preparation, lamination, sawing, honing, resin application, polishing and classification, retouching, package closing, and container loading; 3) the different modes of use of the self which, as a tendency, are responsible for constituting competent action at each stage of the process, in the dialectic between technique and human action; 4) the way each ingredient of industrious competence operates and is constituted, as well as its concentration, as a tendency, at each stage of the process, based on the types of uses of the self that, also as a tendency, are most responsible for competent action, thereby outlining the profile of industrious competence in granite processing at the company under analysis; 5) two possible factors that strengthen the ingredients of industrious competence, namely transduction and non-humans. Building on all of the above, the final considerations problematize aspects of the debate on competences and people-management practices, with competence understood as follows: mastery in the act of drawing on the environment and on oneself to manage work situations, in which action consists of mobilizing resources that are difficult to perceive and describe, inherent to the worker, yet constituted and manifested through uses of the self by oneself and by others in and for the real act of work, markedly at an infinitesimal level, in the face of situations that demand the application of protocols concurrently with the management of variabilities and events that are partly unforeseeable and ineliminable.
Abstract:
Program slicing is a well-known family of techniques used to identify code fragments which depend on, or are depended upon by, specific program entities. They are particularly useful in the areas of reverse engineering, program understanding, testing and software maintenance. Most slicing methods, usually oriented towards the imperative or object paradigms, are based on some sort of graph structure representing program dependencies. Slicing techniques amount, therefore, to (sophisticated) graph traversal algorithms. This paper proposes a completely different approach to the slicing problem for functional programs. Instead of extracting program information to build an underlying dependency structure, we resort to standard program calculation strategies, based on the so-called Bird-Meertens formalism. The slicing criterion is specified either as a projection or a hiding function which, once composed with the original program, leads to the identification of the intended slice. Going through a number of examples, the paper suggests this approach may be an interesting, even if not completely general, alternative to slicing functional programs.
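As a rough illustration of this calculational style (our example, not one from the paper), a slicing criterion given as a projection can be composed with the program and then simplified by fold fusion, yielding a program that computes only the selected component:

    -- Illustrative slice calculation in the Bird-Meertens style.
    -- A program computing two results at once: the sum and the length.
    sumLen :: [Int] -> (Int, Int)
    sumLen = foldr (\x (s, n) -> (x + s, n + 1)) (0, 0)

    -- Slicing criterion: a projection selecting the first component.
    criterion :: (Int, Int) -> Int
    criterion = fst

    -- The composite  criterion . sumLen  specifies the slice. Fold
    -- fusion (fst (0, 0) = 0, and fst applied to the step function
    -- gives x + fst p) calculates it into:
    sumSlice :: [Int] -> Int
    sumSlice = foldr (+) 0    -- the length bookkeeping has vanished

The calculated sumSlice is extensionally equal to criterion . sumLen, but it no longer builds the discarded component, which is precisely what a slice with respect to that projection should omit.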
Abstract:
Program slicing is a well-known family of techniques intended to identify and isolate code fragments which depend on, or are depended upon by, specific program entities. This is particularly useful in the areas of reverse engineering, program understanding, testing and software maintenance. Most slicing methods, and corresponding tools, target either the imperative or the object-oriented paradigms, where program slices are computed with respect to a variable or a program statement. Taking a complementary point of view, this paper focuses on the slicing of higher-order functional programs under a lazy evaluation strategy. A prototype of a Haskell slicer, built as a proof-of-concept for these ideas, is also introduced.
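A toy Haskell fragment (ours, not the prototype's algorithm) suggests why the lazy evaluation strategy matters here: under laziness it is demand, not textual position, that determines which subexpressions belong to a slice.

    -- Under lazy evaluation, only the demanded component is in the slice.
    pair :: (Int, Int)
    pair = (cheap, expensive)
      where
        cheap     = 1 + 2
        expensive = length [1 ..]   -- diverges if ever forced

    demanded :: Int
    demanded = fst pair   -- terminates: `expensive` lies outside the
                          -- slice for this criterion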
Abstract:
More and more current software systems rely on non-trivial coordination logic for combining autonomous services typically running on different platforms and often owned by different organizations. Often, however, coordination data is deeply entangled in the code and, therefore, difficult to isolate and analyse separately. COORDINSPECTOR is a software tool which combines slicing and program analysis techniques to isolate all coordination elements from the source code of an existing application. Such a reverse engineering process provides a clear view of the actually invoked services as well as of the orchestration patterns which bind them together. The tool analyses Common Intermediate Language (CIL) code, the native language of the Microsoft .Net Framework. Therefore, the scope of application of COORDINSPECTOR is quite large: potentially any piece of code developed in any of the programming languages which compile to the .Net Framework. The tool generates graphical representations of the coordination layer and identifies the underlying business process orchestrations, rendering them as Orc specifications.
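The flavour of such an extraction can be suggested by a deliberately simplified sketch: walk the call instructions of a method and keep those that touch coordination-related APIs. The flat string encoding and the namespace list below are hypothetical simplifications of ours, not COORDINSPECTOR's actual analysis of CIL.

    import Data.List (isInfixOf)

    -- Hypothetical flat encoding of CIL call instructions,
    -- e.g. "call System.Threading.Monitor::Enter".
    type CilInstr = String

    -- Illustrative namespaces treated as coordination primitives.
    coordinationApis :: [String]
    coordinationApis =
      [ "System.Threading", "System.ServiceModel", "System.Net" ]

    isCoordination :: CilInstr -> Bool
    isCoordination instr = any (`isInfixOf` instr) coordinationApis

    -- A toy "coordination slice" of a method body.
    extractCoordination :: [CilInstr] -> [CilInstr]
    extractCoordination = filter isCoordination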
Abstract:
Current software development often relies on non-trivial coordination logic for combining autonomous services, possibly running on different platforms. As a rule, however, such a coordination layer is strongly woven within the application at source code level. Therefore, its precise identification becomes a major methodological (and technical) problem and a challenge to any program understanding or refactoring process. The approach introduced in this paper resorts to slicing techniques to extract coordination data from source code. Such data are captured in a specific dependency graph structure from which a coordination model can be recovered either in the form of an Orc specification or as a collection of code fragments corresponding to the identification of typical coordination patterns in the system. Tool support is also discussed.
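The dependency-graph step admits a compact sketch: represent dependencies as an adjacency map and take the slice to be everything reachable from the criterion node. This is generic reachability slicing under names of our choosing, not the paper's exact graph structure.

    import qualified Data.Map as M
    import qualified Data.Set as S

    type Node  = String              -- a program point (hypothetical labels)
    type Graph = M.Map Node [Node]   -- edges point at dependencies

    -- The slice: all nodes reachable from the slicing criterion.
    slice :: Graph -> Node -> S.Set Node
    slice g = go S.empty
      where
        go seen n
          | n `S.member` seen = seen
          | otherwise         = foldl go (S.insert n seen)
                                         (M.findWithDefault [] n g)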
Abstract:
Forest cover of the Maringá municipality, located in northern Paraná State, was mapped in this study. Mapping was carried out using high-resolution HRC sensor imagery and medium-resolution CCD sensor imagery from the CBERS satellite. Images were georeferenced, and forest vegetation patches (TOFs - trees outside forests) were classified using two methods of digital classification: pixel-based (on the reflectance or digital number of each pixel) and object-oriented. The area of each polygon was calculated, which allowed each polygon to be segregated into size classes. Thematic maps were built from the resulting polygon size classes, and summary statistics were generated for each size class in each area. It was found that most forest fragments in Maringá were smaller than 500 m². There was also a difference of 58.44% in the amount of vegetation detected between the high-resolution and medium-resolution imagery, owing to the distinct spatial resolutions of the sensors. It was concluded that high-resolution geotechnology is essential to provide reliable information on urban green areas and forest cover in highly human-perturbed landscapes.
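The post-classification step (computing polygon areas and segregating them into size classes with per-class summary statistics) can be sketched as follows; apart from the <500 m² class mentioned above, the class breaks are hypothetical.

    import qualified Data.Map as M

    -- Bucket a polygon area (m^2) into a size class; breaks are illustrative.
    sizeClass :: Double -> String
    sizeClass a
      | a < 500   = "< 500 m2"
      | a < 5000  = "500-5000 m2"
      | otherwise = "> 5000 m2"

    -- Per-class summary statistics: patch count and total area.
    summary :: [Double] -> M.Map String (Int, Double)
    summary = foldr add M.empty
      where
        add a = M.insertWith merge (sizeClass a) (1, a)
        merge (c1, s1) (c2, s2) = (c1 + c2, s1 + s2)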