924 results for Scenario Programming, Markup Language, End User Programming
Abstract:
1. Establishing biological control agents in the field is a major step in any classical biocontrol programme, yet there are few general guidelines to help the practitioner decide what factors might enhance the establishment of such agents. 2. A stochastic dynamic programming (SDP) approach, linked to a metapopulation model, was used to find optimal release strategies (number and size of releases), given constraints on time and the number of biocontrol agents available. By modelling within a decision-making framework we derived rules of thumb that will enable biocontrol workers to choose between management options, depending on the current state of the system. 3. When there are few well-established sites, making a few large releases is the optimal strategy. For other states of the system, the optimal strategy ranges from a few large releases, through a mixed strategy (a variety of release sizes), to many small releases, as the probability of establishment of smaller inocula increases. 4. Given that the probability of establishment is rarely a known entity, we also strongly recommend a mixed strategy in the early stages of a release programme, to accelerate learning and improve the chances of finding the optimal approach.
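The release-scheduling logic described in this abstract can be illustrated as a small finite-horizon SDP. The sketch below is a heavily simplified illustration, not the authors' metapopulation model: the state is just the number of well-established sites, a fixed seasonal budget of agents is split into equal releases, and `p_establish`, `BUDGET`, `TARGET` and `HORIZON` are all assumed values chosen for the example.

```python
# A heavily simplified sketch of the finite-horizon SDP idea described above,
# NOT the authors' metapopulation model: the state is just the number of
# well-established sites, and each season a fixed budget of agents is split
# into n equal releases.  p_establish, BUDGET, TARGET and HORIZON are assumed.
from functools import lru_cache
from math import comb

BUDGET = 120      # agents available per season (assumed)
TARGET = 5        # established sites needed for "success" (assumed)
HORIZON = 4       # seasons remaining (assumed)

def p_establish(size):
    """Assumed saturating establishment probability for a release of `size` agents."""
    return 1.0 - (1.0 - 0.004) ** size

@lru_cache(maxsize=None)
def value(sites, seasons_left):
    """Maximum probability of reaching TARGET established sites."""
    if sites >= TARGET:
        return 1.0
    if seasons_left == 0:
        return 0.0
    best = 0.0
    for n in (1, 2, 3, 4, 6):                      # candidate numbers of releases
        p = p_establish(BUDGET // n)
        # expectation over how many of the n releases establish this season
        exp = sum(comb(n, k) * p ** k * (1 - p) ** (n - k)
                  * value(min(sites + k, TARGET), seasons_left - 1)
                  for k in range(n + 1))
        best = max(best, exp)
    return best

print(value(0, HORIZON))
```

Tabulating the maximising number of releases for every (sites, seasons_left) pair is what produces state-dependent rules of thumb of the kind summarised in points 3 and 4.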
Abstract:
1. A model of the population dynamics of Banksia ornata was developed, using stochastic dynamic programming (a state-dependent decision-making tool), to determine optimal fire management strategies that incorporate trade-offs between biodiversity conservation and fuel reduction. 2. The modelled population of B. ornata was described by its age and density, and was exposed to the risk of unplanned fires and stochastic variation in germination success. 3. For a given population in each year, three management strategies were considered: (i) lighting a prescribed fire; (ii) controlling the incidence of unplanned fire; (iii) doing nothing. 4. The optimal management strategy depended on the state of the B. ornata population, with the time since the last fire (age of the population) being the most important variable. Lighting a prescribed fire at an age of less than 30 years was only optimal when the density of seedlings after a fire was low (< 100 plants ha⁻¹) or when there were benefits of maintaining a low fuel load by using more frequent fire. 5. Because the cost of management was assumed to be negligible (relative to the value of the persistence of the population), the do-nothing option was never the optimal strategy, although lighting prescribed fires had only marginal benefits when the mean interval between unplanned fires was less than 20-30 years.
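As a rough illustration of the state and action space implied here (not the authors' model), the sketch below encodes the population state as (years since the last fire, post-fire seedling density) and the three management options, with an assumed mean unplanned-fire interval and an assumed stochastic germination rule.

```python
# A rough sketch of the state/action structure implied above, with assumed
# numbers rather than the authors' parameterisation: the state is (years since
# the last fire, post-fire seedling density), and the three management options
# change how likely a fire is in the coming year.
import random

MEAN_FIRE_INTERVAL = 25.0      # mean years between unplanned fires (assumed)

def step(age, density, action, rng=random):
    """Advance the modelled B. ornata state by one year under a management action."""
    if action == "prescribed_burn":
        fire = True
    elif action == "control_unplanned":
        fire = rng.random() < 0.3 / MEAN_FIRE_INTERVAL   # assumed suppression effect
    else:  # "do_nothing"
        fire = rng.random() < 1.0 / MEAN_FIRE_INTERVAL
    if fire:
        # stochastic germination: assumed to improve with stand age up to ~30 years
        new_density = max(rng.gauss(20.0 * min(age, 30), 100.0), 0.0)
        return 0, new_density
    return age + 1, density

print(step(40, 350.0, "do_nothing"))
```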
Abstract:
A chance constrained programming model is developed to assist Queensland barley growers make varietal and agronomic decisions in the face of changing product demands and volatile production conditions. Unsuitable or overlooked in many risk programming applications, the chance constrained programming approach nonetheless aptly captures the single-stage decision problem faced by barley growers of whether to plant lower-yielding but potentially higher-priced malting varieties, given a particular expectation of meeting malting grade standards. Different expectations greatly affect the optimal mix of malting and feed barley activities. The analysis highlights the suitability of chance constrained programming to this specific class of farm decision problem.
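A minimal chance-constrained programming sketch in the same spirit (not the paper's model) is shown below: the grower splits a fixed area between malting and feed barley, expected gross margin is maximised, and a probabilistic income floor is enforced through its normal-quantile deterministic equivalent. All margins, variances and the reliability level are assumptions.

```python
# A minimal chance-constrained programming sketch in the spirit of the model
# described above, but NOT the paper's model: area is split between malting
# and feed barley, expected gross margin is maximised, and a probabilistic
# income floor is imposed via a normal-quantile deterministic equivalent.
from scipy.stats import norm
from scipy.optimize import linprog

AREA = 100.0                       # hectares available (assumed)
ALPHA = 0.9                        # required probability of meeting the floor (assumed)
mu_malt, sd_malt = 450.0, 180.0    # $/ha gross margin, malting barley (assumed)
mu_feed, sd_feed = 380.0, 60.0     # $/ha gross margin, feed barley (assumed)
INCOME_MIN = 30_000.0              # income floor in the chance constraint (assumed)

z = norm.ppf(ALPHA)
# The exact deterministic equivalent mu.x - z*sqrt(x'Sigma x) >= INCOME_MIN is
# nonlinear, so this sketch uses a cruder per-hectare downside margin mu - z*sd.
down_malt = mu_malt - z * sd_malt
down_feed = mu_feed - z * sd_feed

res = linprog(c=[-mu_malt, -mu_feed],                       # maximise expected margin
              A_ub=[[1.0, 1.0], [-down_malt, -down_feed]],  # land limit, income floor
              b_ub=[AREA, -INCOME_MIN],
              bounds=[(0, None), (0, None)])
print(res.x)    # optimal hectares of (malting, feed)
```

Raising or lowering ALPHA in such a sketch shifts the optimal mix between the higher-margin but riskier malting activity and the safer feed activity, which is the qualitative effect the abstract describes.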
Abstract:
A modelling framework is developed to determine the joint economic and environmental net benefits of alternative land allocation strategies. Estimates of community preferences for preservation of natural land, derived from a choice modelling study, are used as input to a model of agricultural production in an optimisation framework. The trade-offs between agricultural production and environmental protection are analysed using the sugar industry of the Herbert River district of north Queensland as an example. Spatially-differentiated resource attributes and the opportunity costs of natural land determine the optimal tradeoffs between production and conservation for a range of sugar prices.
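The trade-off logic can be illustrated with a toy allocation rule, sketched below under assumed numbers: each parcel carries an agricultural gross margin and a preservation value (standing in for the choice-modelling estimates), and a parcel stays natural whenever its preservation value exceeds the opportunity cost of cane production at the prevailing sugar margin.

```python
# A toy sketch of the production/conservation trade-off described above, with
# assumed numbers: each parcel has a cane yield and a community preservation
# value (standing in for the choice-modelling estimates), and a parcel stays
# natural whenever preservation value exceeds the forgone agricultural margin.
parcels = [
    # (parcel id, cane yield t/ha, preservation value $/ha/yr), all assumed
    ("A", 85, 120), ("B", 60, 310), ("C", 95, 40), ("D", 30, 260),
]

def allocate(cane_margin_per_tonne):
    """Net-benefit-maximising use of each parcel at a given cane margin."""
    plan = {}
    for pid, yield_t, preserve_val in parcels:
        ag_margin = yield_t * cane_margin_per_tonne
        plan[pid] = "cane" if ag_margin > preserve_val else "natural"
    return plan

for margin in (1.0, 3.0, 5.0):      # $/t margins spanning a range of sugar prices
    print(margin, allocate(margin))
```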
Abstract:
Taking functional programming to its extremities in search of simplicity still requires integration with other development (e.g. formal) methods. Induction is the key to deriving and verifying functional programs, but can be simplified through packaging proofs with functions, particularly folds, on data (structures). Totally Functional Programming avoids the complexities of interpretation by directly representing data (structures) as platonic combinators - the functions characteristic to the data. The link between the two simplifications is that platonic combinators are a kind of partially-applied fold, which means that platonic combinators inherit fold-theoretic properties, but with some apparent simplifications due to the platonic combinator representation. However, despite observable behaviour within functional programming that suggests that TFP is widely applicable, significant work remains before TFP as such could be widely adopted.
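The idea that a data structure can be represented directly by its characteristic function, that is, as a partially-applied fold, can be illustrated even outside a typed functional language; the Python sketch below is only illustrative and uses hypothetical names.

```python
# An illustrative rendering of the "platonic combinator" idea in Python
# lambdas: a list is represented directly by the function that folds over it,
# so the data structure IS a partially-applied fold.  Names are illustrative.
def nil(cons_fn, empty):
    return empty

def cons(head, tail):
    return lambda cons_fn, empty: cons_fn(head, tail(cons_fn, empty))

# the list [1, 2, 3], represented as its own fold
xs = cons(1, cons(2, cons(3, nil)))

length = xs(lambda _, acc: 1 + acc, 0)        # fold computing the length
total  = xs(lambda x, acc: x + acc, 0)        # fold computing the sum
as_py  = xs(lambda x, acc: [x] + acc, [])     # convert back to a native list
print(length, total, as_py)                   # -> 3 6 [1, 2, 3]
```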
Abstract:
Although planning is important for the functioning of patients with dementia of the Alzheimer Type (DAT), little is known about response programming in DAT. This study used a cueing paradigm coupled with quantitative kinematic analysis to document the preparation and execution of movements made by a group of 12 DAT patients and their age- and sex-matched controls. Participants connected a series of targets placed upon a WACOM SD420 graphics tablet, in response to the pattern of illumination of a set of light emitting diodes (LEDs). In one condition, participants could programme the upcoming movement, whilst in another they were forced to reprogramme this movement on-line (i.e. they were not provided with advance information about the location of the upcoming target). DAT patients were found to have programming deficits, taking longer to initiate movements, particularly in the absence of cues. While problems spontaneously programming a movement might cause a greater reliance upon on-line guidance, when both groups were required to guide the movement on-line, DAT patients continued to show slower and less efficient movements, implying declining sensori-motor function; these differences were not simply due to strategy or medication status. (C) 1997 Elsevier Science Ltd.
Abstract:
Substance dependence is highly associated with executive cognitive function (ECF) impairments. However, considering that it is difficult to assess ECF clinically, the aim of the present study was to examine the feasibility of a brief neuropsychological tool (the Frontal Assessment Battery, FAB) to detect specific ECF impairments in a sample of substance-dependent individuals (SDI). Sixty-two subjects participated in this study. Thirty DSM-IV-diagnosed SDI, after 2 weeks of abstinence, and 32 healthy individuals (control group) were evaluated with the FAB and other ECF-related tasks: digits forward (DF), digits backward (DB), the Stroop Color Word Test (SCWT), and the Wisconsin Card Sorting Test (WCST). SDI did not differ from the control group on sociodemographic variables or IQ. However, SDI performed below the controls in DF, DB, and FAB. The SDI were cognitively impaired in 3 of the 6 cognitive domains assessed by the FAB: abstract reasoning, motor programming, and cognitive flexibility. The FAB correlated with DF, SCWT, and WCST. In addition, some neuropsychological measures were correlated with the amount of alcohol, cannabis, and cocaine use. In conclusion, SDI performed more poorly than the comparison group on the FAB, and the FAB's results were associated with other ECF-related tasks. The results suggested a negative impact of alcohol, cannabis, and cocaine use on the ECF. The FAB may be useful in assisting professionals as an instrument to screen for ECF-related deficits in SDI. (C) 2010 Elsevier Ltd. All rights reserved.
Abstract:
Neospora caninum is an apicomplexan parasite responsible for major economic losses due to abortions in cattle. Toll-like receptors (TLRs) sense specific microbial products and direct downstream signaling pathways in immune cells, linking innate and adaptive immunity. Here, we analyze the role of TLR2 on innate and adaptive immune responses during N. caninum infection. Inflammatory peritoneal macrophages and bone marrow-derived dendritic cells exposed to N. caninum-soluble antigens presented an upregulated expression of TLR2. Increased receptor expression was correlated to TLR2/MyD88-dependent antigen-presenting cell maturation and pro-inflammatory cytokine production after stimulation by antigens. Impaired innate responses observed after infection of mice genetically deficient for TLR2 (TLR2-/-) were followed by downregulation of adaptive T helper 1 (Th1) immunity, represented by diminished parasite-specific CD4+ and CD8+ T-cell proliferation, IFN-gamma:interleukin (IL)-10 ratio, and IgG subclass synthesis. In parallel, TLR2-/- mice presented a higher parasite burden than wild-type (WT) mice at acute and chronic stages of infection. These results show that initial recognition of N. caninum by TLR2 participates in the generation of effector immune responses against N. caninum and imply that the receptor may be a target for future prophylactic strategies against neosporosis. Immunology and Cell Biology (2010) 88, 825-833; doi:10.1038/icb.2010.52; published online 20 April 2010
Abstract:
A decision theory framework can be a powerful technique to derive optimal management decisions for endangered species. We built a spatially realistic stochastic metapopulation model for the Mount Lofty Ranges Southern Emu-wren (Stipiturus malachurus intermedius), a critically endangered Australian bird. Using discrete-time Markov chains to describe the dynamics of a metapopulation and stochastic dynamic programming (SDP) to find optimal solutions, we evaluated the following different management decisions: enlarging existing patches, linking patches via corridors, and creating a new patch. This is the first application of SDP to optimal landscape reconstruction and one of the few times that landscape reconstruction dynamics have been integrated with population dynamics. SDP is a powerful tool that has advantages over standard Monte Carlo simulation methods because it can give the exact optimal strategy for every landscape configuration (combination of patch areas and presence of corridors) and pattern of metapopulation occupancy, as well as a trajectory of strategies. It is useful when a sequence of management actions can be performed over a given time horizon, as is the case for many endangered species recovery programs, where only fixed amounts of resources are available in each time step. However, it is generally limited by computational constraints to rather small networks of patches. The model shows that optimal metapopulation management decisions depend greatly on the current state of the metapopulation, and there is no strategy that is universally the best. The extinction probability over 30 yr for the optimal state-dependent management actions is 50-80% better than no management, whereas the best fixed state-independent sets of strategies are only 30% better than no management. This highlights the advantages of using a decision theory tool to investigate conservation strategies for metapopulations. It is clear from these results that the sequence of management actions is critical, and this can only be effectively derived from stochastic dynamic programming. The model illustrates the underlying difficulty in determining simple rules of thumb for the sequence of management actions for a metapopulation. This use of a decision theory framework extends the capacity of population viability analysis (PVA) to manage threatened species.
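The core of the method, a Bellman backup over every combination of landscape configuration and occupancy pattern, can be sketched as below. This is a schematic illustration under assumed names (`ACTIONS`, `transition`, `value_next`), not the authors' implementation: `transition` stands in for the discrete-time Markov-chain metapopulation model.

```python
# A bare-bones sketch of the backup at the heart of this kind of analysis,
# under assumed names and NOT the authors' implementation: the state combines
# landscape configuration and occupancy pattern, and transition() stands in
# for the discrete-time Markov-chain metapopulation model.  Unlike Monte Carlo
# simulation, the backup returns the exact optimal action for every state.
ACTIONS = ("enlarge_patch", "add_corridor", "create_patch", "do_nothing")

def bellman_backup(state, value_next, transition):
    """One step of finite-horizon SDP: best action and its expected value.

    value_next maps next-year states to their optimal persistence probability;
    transition(state, action) yields (next_state, probability) pairs.
    """
    best_action, best_value = None, -1.0
    for action in ACTIONS:
        expected = sum(p * value_next[s2] for s2, p in transition(state, action))
        if expected > best_value:
            best_action, best_value = action, expected
    return best_action, best_value

# toy usage with a two-state system and assumed numbers
value_next = {"occupied": 0.9, "empty": 0.2}
def transition(state, action):
    boost = 0.1 if action != "do_nothing" else 0.0
    return [("occupied", 0.6 + boost), ("empty", 0.4 - boost)]
print(bellman_backup("occupied", value_next, transition))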
Abstract:
The initial aim of this research was to follow work processes in the light of the theoretical framework of Ergology, thus conceiving of work as a dialectical relationship between technique and human action. The objective was to map the work involved in the granite finishing process at a large organisation located in Espírito Santo and, after some time in the field, the research problem took the following form: how is industrious competence constituted in granite finishing in a large organisation? The research is justified because, despite its economic relevance, the ornamental stone sector of Espírito Santo suffers from serious shortcomings in management. For Organisational Studies, its relevance is reinforced by bringing the ergological approach into this area and by introducing into the debate on competence the notion of industrious competence, not yet explored in this field of study. The research was carried out as an ergological cartography, articulating cartographic cues with the theoretical-conceptual framework of Ergology, and used the following techniques: participant observation over 6 months, with an average of 3 field visits per week; 8 semi-structured, in-depth interviews of about 50 minutes each with operational workers; one interview with the production manager and another with a representative of the People Management area; conversations with the other workers in order to enrich the field diary; and further conversations and observations at the end of the analysis, for confrontation and validation with the workers. The analytical procedures can be summarised as follows: a) floating readings aimed at bringing out central aspects of the two dimensions of work, technique and human action; b) in-depth readings aimed at bringing out singularities and specificities of the dialectic between the two; c) in-depth readings aimed at bringing out aspects of the ingredients of industrious competence.
Although analytical categories and subcategories were not delimited in advance, five analytical axes emerged from the analysis: 1) the procedures to be employed in the granite finishing process, encompassing the stages of finishing; the functions to be performed and the tasks to be carried out; the regulatory norms; the technical knowledge required to program and operate the machines; and the production order prescribed by the commercial department; 2) real work, distinguished from work as the mere application of procedures by its focus on human action in dealing with real situations, full of events and variability, throughout the process, encompassing load preparation, lamination, sawing, honing, resin application, polishing and classification, retouching, bundle packing and container loading; 3) the different modes of use of the self that, as a tendency, are responsible for constituting competent action at each stage of the process, in the dialectic between technique and human action; 4) the way each ingredient of industrious competence acts and is constituted, as well as its concentration, as a tendency, at each stage of the process, based on the types of use of the self that, also as a tendency, are most responsible for competent action, thereby outlining the profile of industrious competence in granite finishing in the company under analysis; 5) two possible factors that strengthen the ingredients of industrious competence, namely transduction and non-humans. On the basis of the above, the closing considerations problematise aspects of the debate on competences and people management practices, taking competence to be understood as follows: mastery in the act of drawing on the environment and on oneself to manage work situations, in which action consists of mobilising resources that are difficult to perceive and describe, inherent to the worker, yet constituted and manifested through uses of the self by oneself and by others in and for the real act of work, markedly at an infinitesimal level, in the face of situations that demand the application of protocols alongside the management of variability and events that are partly unforeseeable and impossible to eliminate.
Abstract:
The main objective of this work was the development of a computer-based two-channel vibration analyser for performing diagnostics within the scope of machine condition monitoring. An application was developed on an ordinary computer, in the LabVIEW software, which collects vibration data through MEMS-type acceleration transducers connected via USB and then processes the data and presents it to the user. The data-processing tools used are the common tools found in many vibration analysers available on the market, such as frequency-spectrum plots, the time signal, waterfall plots and overall vibration level values, among others. Although the analyser developed does not bring innovation in the analysis tools adopted, it is intended to stand out for its low cost, simplicity and didactic character. This work highlights the advantages, disadvantages and potential of an analyser of this kind. Conclusions are drawn regarding its fault-diagnosis capability, its capabilities as a teaching tool, the sensors used and the programming language chosen. As its main conclusion, the work shows that the chosen sensors are not suited to fault diagnosis in an industrial environment, yet they are ideal for making this analyser a good teaching and training tool.
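The analyser itself is a LabVIEW application, but the core processing it performs (an amplitude spectrum and an overall RMS vibration level) can be sketched in a few lines; the sketch below uses a synthetic acceleration signal and assumed sampling parameters.

```python
# The analyser itself is a LabVIEW application, but its core processing
# (an amplitude spectrum and an overall RMS vibration level) can be sketched
# briefly.  The acceleration signal below is synthetic and the sampling
# parameters are assumptions.
import numpy as np

fs = 5_000                                    # sample rate, Hz (assumed)
t = np.arange(0, 1.0, 1 / fs)
# synthetic acceleration: 50 Hz shaft component + 120 Hz tone + noise
accel = 2.0 * np.sin(2 * np.pi * 50 * t) + 0.5 * np.sin(2 * np.pi * 120 * t)
accel += 0.1 * np.random.randn(t.size)

rms = np.sqrt(np.mean(accel ** 2))            # overall vibration level

spectrum = np.abs(np.fft.rfft(accel)) * 2 / t.size   # single-sided amplitude spectrum
freqs = np.fft.rfftfreq(t.size, 1 / fs)
peak_hz = freqs[np.argmax(spectrum[1:]) + 1]         # dominant non-DC component

print(f"overall RMS = {rms:.2f} m/s^2, dominant component at {peak_hz:.0f} Hz")
```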