958 results for Offline programming
Abstract:
It was with great satisfaction that we received the invitation to publish in the Cadernos de Sociomuseologia, both because we consider it an important publication in the field of Museology in the Portuguese language and because we have relied, on numerous occasions, on the reflections of professionals from Portugal concerning the theme addressed here, namely museological programming. Although this work was originally presented as a monograph concluding the academic activities of the Specialization Course in Museology of the Museum of Archaeology and Ethnology of the University of São Paulo, carried out between 2001 and 2002, it has undergone only minimal changes. The theme was chosen as a result of professional and academic concerns and reflections aimed at experimenting with methodologies capable of contributing to the needs of the field of Museology within the Brazilian context. Its case study is a consultancy project carried out for the Division of Iconography and Museums of the Department of Historical Heritage of the Municipal Secretariat of Culture of São Paulo and for Anhembi Turismo e Eventos da Cidade de São Paulo, which consisted of producing a diagnosis of the collection belonging to Anhembi with a view to establishing the Centro de Memória do Samba de São Paulo. (Monograph presented for the conclusion of the CEAM of the Museum of Archaeology and Ethnology of the University of São Paulo)
Abstract:
Safety control for preserving the structural integrity of dams is, during normal operation, an activity centred essentially on inspections of the structure and on the data resulting from periodic monitoring of the works, supported by models of the dam's behaviour. Accordingly, the analysis of emergency situations generally requires the attention of a dam safety specialist who, given the available monitoring results and the application of models of the structure's behaviour, can identify the alert level appropriate to the situation occurring at the dam. This traditional approach to safety control is effective, but it has the disadvantage that a significant period of time may elapse between the identification of an anomalous process and the definition of its level of severity. The use of new decision-support technologies and emergency planning can help mitigate this disadvantage. The present work consists of developing a model for assessing the behaviour of a dam by applying Multilayer Perceptron neural networks to the monitoring results of an embankment dam, in order to identify behavioural anomalies and quantify the corresponding alert level. The thesis is divided essentially into two parts. The first part addresses aspects related to embankment dams, namely defining the most common structural solutions and identifying the main types of deterioration that may arise in these structures. Issues related to safety control and emergency planning for embankment dams are also addressed. The second part of the work concerns the neural network model developed in the Java programming language, the ALBATROZ model. This model determines the alert level as a function of the reservoir water level, the pressure recorded at four piezometers located in the dam body and foundation, and the flow seeping through the dam and its foundation. In this part, the work uses the monitoring results of the Valtorno/Mourão dam and the results of a finite element model (developed at the Laboratório Nacional de Engenharia Civil as part of the dam's monitoring plan) to simulate the dam's behaviour and provide data for training the neural network. The present work concluded that the development of neural networks relating the values recorded for some of the quantities monitored by the observation system to the alert level associated with an anomalous situation at the dam can contribute to the rapid identification of emergency situations and allow timely action to resolve them. This characteristic makes neural networks an important element of emergency planning for dams and also an instrument supporting their safety control.
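As a rough illustration of the kind of mapping described above (monitored quantities in, alert level out), the sketch below trains a small Multilayer Perceptron classifier. It is not the ALBATROZ model itself, which was written in Java and trained on Valtorno/Mourão observations and finite element results; this sketch uses Python with scikit-learn, a hypothetical feature ordering and purely synthetic data.

```python
# Illustrative sketch only: a Multilayer Perceptron that maps monitored dam
# quantities to a discrete alert level, in the spirit of the ALBATROZ model
# described above (the original was implemented in Java; the training data
# here are synthetic placeholders, not Valtorno/Mourão observations).
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)

# Hypothetical features: reservoir water level, four piezometer pressures,
# and seepage flow through the dam and foundation (units are arbitrary).
X = rng.normal(size=(500, 6))
# Hypothetical target: alert level 0 (normal) to 3 (emergency).
y = rng.integers(0, 4, size=500)

model = MLPClassifier(hidden_layer_sizes=(16, 16), max_iter=2000, random_state=0)
model.fit(X, y)

# Classify a new set of readings.
new_reading = rng.normal(size=(1, 6))
print("Predicted alert level:", model.predict(new_reading)[0])
```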
Abstract:
It is no news that the prevailing paradigm is based on the Internet, with more and more applications changing their business model with respect to licensing and maintenance in order to offer the end user an application that is more affordable in terms of licensing and maintenance costs, since applications are distributed, eliminating the capital and operational expenditure inherent in a centralised architecture. With the spread of Internet-based Application Programming Interfaces (APIs), developers can now build applications that use functionality provided by third parties without having to program it from scratch. In this context, the APIs of Google® applications allow applications to be distributed to a very large market and integrated with productivity tools, representing an opportunity for the dissemination of ideas and concepts. This work describes the process of designing and implementing a platform, using HTML5, Javascript, PHP and MySQL with Google® Apps integration, whose goal is to allow the user to prepare budgets, from the calculation of composite cost prices to the preparation of sale prices and the drafting of the specifications document and the corresponding schedule.
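As a small illustration of the budgeting logic the platform is described as supporting, the sketch below builds a composite cost price from component quantities and unit costs and derives a sale price from a margin. The component names, rates and margin are invented for the example, and the snippet is written in Python rather than the PHP/Javascript the platform actually uses.

```python
# Minimal sketch of the kind of budget arithmetic described above: a composite
# cost price built from component quantities and unit costs, and a sale price
# derived from a chosen margin. All item names, rates and the margin are
# hypothetical placeholders.
components = [
    {"description": "labour (h)", "quantity": 8.0, "unit_cost": 15.0},
    {"description": "concrete (m3)", "quantity": 2.5, "unit_cost": 90.0},
    {"description": "formwork (m2)", "quantity": 6.0, "unit_cost": 12.0},
]

composite_cost = sum(c["quantity"] * c["unit_cost"] for c in components)

margin = 0.18  # assumed 18% margin on top of the composite cost price
sale_price = composite_cost * (1 + margin)

print(f"Composite cost price: {composite_cost:.2f}")
print(f"Sale price (18% margin): {sale_price:.2f}")
```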
Abstract:
This dissertation sought to understand how the cultural policies of the Regional Government of the Azores, through investment in cultural facilities, influenced the democratisation of access to culture in the region between 1976 and 2008. To that end, the approved government programmes were analysed, followed by the collection of financial data on expenditure and investment in the region's cultural sector. Existing data on visitors to the selected facilities were also gathered, specifically the museums under the authority of the regional public administration, which are the object of this case study. Once organised, all these data were analysed and hypotheses for comparing them were constructed, so as to summarise the evolution and trends of these figures. In order to understand developments over the period analysed, the most complete possible collection of all legislation created for the sector at the regional level was carried out, making it possible to analyse this consolidation. After analysing all the data collected and processed, it can be seen that the promotion of measures aimed at greater cultural democratisation in the Azores depends on several factors: strong financial investment in the facilities in question (such as building works and technical equipment); the development of structuring legislation; a stance of cultural decentralisation; the hiring of specialised staff and the training of existing staff; and the creation of a regional network of museums. All these actions demonstrate the work of the regional administration, through the implementation of cultural policies, towards greater democratisation of access to culture for the population.
Abstract:
This article summarizes a master's thesis in Health and Safety at Work. A study was made of the implementation of fire Self-Protection Measures (MAP) in a school of basic and secondary education, housed in a detached three-storey building that accommodates 952 people. The study aimed to contribute to better outcomes in emergency situations and to a stronger safety culture and resilience. Carrying it out involved meetings with the institution's Safety Officer (DS), reconnaissance visits to the facilities, analysis of existing documentation, preparation of auxiliary documentation, and the planning of awareness-raising and simulation exercises involving the deployment of resources. The methodology was based on participant observation, using video recordings of the activities carried out for later analysis. At the end of the study it was concluded that the Safety Agents (AS) were not, at the outset, equipped to perform their roles under the MAP. It was nevertheless found to be possible to develop some of the required skills through information, education and training, which introduced the AS to fire fighting, evacuation and first aid, and raised their awareness of the consequences to which they may be exposed and of the need for emotional control and effective communication in an emergency situation.
Abstract:
Sales of physical music formats have fallen in Ecuador owing to a lack of control over piracy and to high retail prices, leading to the closure of record shops and the strengthening of illegal distribution. The launch of legal digital music stores in the country represents an opportunity for record companies to boost the consumption of digital music, with iTunes standing out as the platform with the greatest reach and penetration, yet no company has developed a strategy around it. Universal Music is the world's leading record company and the only one with its own operations in Ecuador. For it, a mixed strategy of traditional and digital marketing is proposed, following the focus strategy the company currently pursues, oriented towards the young and young-adult segment, together with two specific strategies: market penetration, through the design of an online and offline media communication plan that promotes the sale of the company's digital product on iTunes and increases its share of the country's digital music market; and concentric diversification, proposing digital music as a new format offering quality, variety and low cost to a market that turns to piracy for lack of options. The action plan draws on strategies and tools derived from an analysis of the traditional 4 Ps and the new 4 Ps, from an online and offline perspective covering product e-marketing, e-promotion, e-communication, e-advertising and e-commerce. The implementation of the strategy involves applying the proposed mixed media plan to the new album by the singer Juanes, "Loco de Amor", with emphasis on the online media campaign.
Abstract:
This paper examines to what extent crops and their environment should be viewed as a coupled system. Crop impact assessments currently use climate model output offline to drive process-based crop models. However, in regions where local climate is sensitive to land surface conditions more consistent assessments may be produced with the crop model embedded within the land surface scheme of the climate model. Using a recently developed coupled crop–climate model, the sensitivity of local climate, in particular climate variability, to climatically forced variations in crop growth throughout the tropics is examined by comparing climates simulated with dynamic and prescribed seasonal growth of croplands. Interannual variations in land surface properties associated with variations in crop growth and development were found to have significant impacts on near-surface fluxes and climate; for example, growing season temperature variability was increased by up to 40% by the inclusion of dynamic crops. The impact was greatest in dry years where the response of crop growth to soil moisture deficits enhanced the associated warming via a reduction in evaporation. Parts of the Sahel, India, Brazil, and southern Africa were identified where local climate variability is sensitive to variations in crop growth, and where crop yield is sensitive to variations in surface temperature. Therefore, offline seasonal forecasting methodologies in these regions may underestimate crop yield variability. The inclusion of dynamic crops also altered the mean climate of the humid tropics, highlighting the importance of including dynamical vegetation within climate models.
Abstract:
The Joint UK Land Environment Simulator (JULES) was run offline to investigate the sensitivity of the simulated land surface to changes in surface type over South Africa. Sensitivity tests were made in idealised experiments in which the actual land surface cover was replaced by a single homogeneous surface type. The vegetation surface types used in some of the experiments are static. The experimental runs were evaluated against a control run. The results show, among other things, that changing the surface cover leads to changes in other variables such as soil moisture, albedo and net radiation. These changes are also visible in the spin-up process, with different surfaces spinning up over different numbers of cycles. Because JULES is the land surface model of the Unified Model, the results could be more physically meaningful if it were coupled to the Unified Model.
Abstract:
The paper describes the implementation of an offline, low-cost Brain Computer Interface (BCI), an alternative to more expensive commercial models. Using inexpensive general-purpose clinical EEG acquisition hardware (Truscan32, Deymed Diagnostic) as the base unit, a synchronisation module was constructed so that the EEG hardware could be operated with precise timing, allowing automatically time-stamped EEG signals to be recorded. The synchronisation module allows the EEG recordings to be aligned in a stimulus time-locked fashion for further processing by the classifier, which establishes the class of the stimulus sample by sample. This allows signals to be acquired from the subject's brain for a goal-oriented BCI application based on the oddball paradigm. An appropriate graphical user interface (GUI) was constructed and implemented as the method for eliciting the required responses (in this case Event Related Potentials, or ERPs) from the subject.
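To make the stimulus time-locked alignment concrete, the sketch below cuts epochs out of a continuous multichannel recording around time-stamped stimulus onsets and averages them into an ERP. It is only an illustration of the general technique, not the system's actual processing chain; the sampling rate, window lengths and data are hypothetical placeholders.

```python
# Illustrative sketch (not the system described above): extracting stimulus
# time-locked EEG epochs for ERP analysis, the kind of alignment the
# synchronisation module makes possible.
import numpy as np

fs = 256                                     # assumed sampling rate in Hz
eeg = np.random.randn(32, 60 * fs)           # 32 channels, 60 s of synthetic EEG
stimulus_samples = np.array([512, 2048, 4096, 8192])  # time-stamped stimulus onsets

pre, post = int(0.2 * fs), int(0.8 * fs)     # 200 ms before to 800 ms after onset

# Stack one epoch per stimulus: shape (n_stimuli, n_channels, pre + post).
epochs = np.stack([eeg[:, s - pre:s + post] for s in stimulus_samples])

# Baseline-correct each epoch and average across stimuli to obtain the ERP.
epochs -= epochs[:, :, :pre].mean(axis=2, keepdims=True)
erp = epochs.mean(axis=0)
print("ERP array shape (channels, samples):", erp.shape)
```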
Abstract:
Recent literature has described a “transition zone” between the average top of deep convection in the Tropics and the stratosphere. Here transport across this zone is investigated using an offline trajectory model. Particles were advected by the resolved winds from the European Centre for Medium-Range Weather Forecasts reanalyses. For each boreal winter clusters of particles were released in the upper troposphere over the four main regions of tropical deep convection (Indonesia, central Pacific, South America, and Africa). Most particles remain in the troposphere, descending on average for every cluster. The horizontal components of 5-day trajectories are strongly influenced by the El Niño–Southern Oscillation (ENSO), but the Lagrangian average descent does not have a clear ENSO signature. Tropopause crossing locations are first identified by recording events when trajectories from the same release regions cross the World Meteorological Organization lapse rate tropopause. Most crossing events occur 5–15 days after release, and 30-day trajectories are sufficiently long to estimate crossing number densities. In a further two experiments slight excursions across the lapse rate tropopause are differentiated from the drift deeper into the stratosphere by defining the “tropopause zone” as a layer bounded by the average potential temperature of the lapse rate tropopause and the profile temperature minimum. Transport upward across this zone is studied using forward trajectories released from the lower bound and back trajectories arriving at the upper bound. Histograms of particle potential temperature (θ) show marked differences between the transition zone, where there is a slow spread in θ values about a peak that shifts slowly upward, and the troposphere below 350 K. There forward trajectories experience slow radiative cooling interspersed with bursts of convective heating resulting in a well-mixed distribution. In contrast θ histograms for back trajectories arriving in the stratosphere have two distinct peaks just above 300 and 350 K, indicating the sharp change from rapid convective heating in the well-mixed troposphere to slow ascent in the transition zone. Although trajectories slowly cross the tropopause zone throughout the Tropics, all three experiments show that most trajectories reaching the stratosphere from the lower troposphere within 30 days do so over the west Pacific warm pool. This preferred location moves about 30°–50° farther east in an El Niño year (1982/83) and about 30° farther west in a La Niña year (1988/89). These results could have important implications for upper-troposphere–lower-stratosphere pollution and chemistry studies.
Abstract:
Many weeds occur in patches but farmers frequently spray whole fields to control the weeds in these patches. Given a geo-referenced weed map, technology exists to confine spraying to these patches. Adoption of patch spraying by arable farmers has, however, been negligible, partly due to the difficulty of constructing weed maps. Building on previous DEFRA and HGCA projects, this proposal aims to develop and evaluate a machine vision system to automate the weed mapping process. The project thereby addresses the principal technical stumbling block to widespread adoption of site-specific weed management (SSWM). The accuracy of weed identification by machine vision based on a single field survey may be inadequate to create herbicide application maps. We therefore propose to test the hypothesis that sufficiently accurate weed maps can be constructed by integrating information from geo-referenced images captured automatically at different times of the year during normal field activities. Accuracy of identification will also be increased by utilising a priori knowledge of weeds present in fields. To prove this concept, images will be captured from arable fields on two farms and processed offline to identify and map the weeds, focussing especially on black-grass, wild oats, barren brome, couch grass and cleavers. As advocated by Lutman et al. (2002), the approach uncouples the weed mapping and treatment processes and builds on the observation that patches of these weeds are quite stable in arable fields.
There are three main aspects to the project.
1) Machine vision hardware. The hardware components of the system are one or more cameras connected to a single board computer (Concurrent Solutions LLC) and interfaced with an accurate Global Positioning System (GPS) supplied by Patchwork Technology. The camera(s) will take separate measurements for each of the three primary colours of visible light (red, green and blue) in each pixel. The basic proof of concept can be achieved in principle using a single-camera system, but in practice systems with more than one camera may need to be installed so that larger fractions of each field can be photographed. Hardware will be reviewed regularly during the project in response to feedback from other work packages and updated as required.
2) Image capture and weed identification software. The machine vision system will be attached to toolbars of farm machinery so that images can be collected during different field operations. Images will be captured at different ground speeds, in different directions and at different crop growth stages, as well as against different crop backgrounds. Having captured geo-referenced images in the field, image analysis software will be developed by Murray State and Reading Universities, with advice from The Arable Group, to identify weed species. A wide range of pattern recognition techniques, and in particular Bayesian Networks, will be used to advance the state of the art in machine vision-based weed identification and mapping. Weed identification algorithms used by others are inadequate for this project, as we intend to collect and correlate images captured at different growth stages. Plants grown for this purpose by Herbiseed will be used in the first instance. In addition, our image capture and analysis system will include plant characteristics such as leaf shape, size, vein structure, colour and textural pattern, some of which are not detectable by other machine vision systems or are omitted by their algorithms. Using such a list of features observable with our machine vision system, we will determine those that can be used to distinguish the weed species of interest.
3) Weed mapping. Geo-referenced maps of weeds in arable fields (Reading University and Syngenta) will be produced with advice from The Arable Group and Patchwork Technology. Natural infestations will be mapped in the fields, but we will also introduce specimen plants in pots to facilitate more rigorous system evaluation and testing. Manual weed maps of the same fields will be generated by Reading University, Syngenta and Peter Lutman so that the accuracy of automated mapping can be assessed. The principal hypothesis and concept to be tested is that, by combining maps from several surveys, a weed map with acceptable accuracy for end-users can be produced.
If the concept is proved and can be commercialised, systems could be retrofitted at low cost onto existing farm machinery. The outputs of the weed mapping software would then link with the precision farming options already built into many commercial sprayers, allowing their use for targeted, site-specific herbicide applications. Immediate economic benefits would, therefore, arise directly from reducing herbicide costs. SSWM will also reduce the overall pesticide load on the crop and so may reduce pesticide residues in food and drinking water, and reduce adverse impacts of pesticides on non-target species and beneficials. Farmers may even choose to leave unsprayed some non-injurious, environmentally beneficial, low-density weed infestations. These benefits fit very well with the anticipated legislation emerging in the new EU Thematic Strategy for Pesticides, which will encourage more targeted use of pesticides and greater uptake of Integrated Crop (Pest) Management approaches, and also with the requirements of the Water Framework Directive to reduce levels of pesticides in water bodies. The greater precision of weed management offered by SSWM is therefore a key element in preparing arable farming systems for the future, where policy makers and consumers want to minimise pesticide use and the carbon footprint of farming while maintaining food production and security. The mapping technology could also be used on organic farms to identify areas of fields needing mechanical weed control, thereby reducing both carbon footprints and damage to crops by, for example, spring tines.
Objectives: i. to develop a prototype machine vision system for automated image capture during agricultural field operations; ii. to prove the concept that images captured by the machine vision system over a series of field operations can be processed to identify and geo-reference specific weeds in the field; iii. to generate weed maps from the geo-referenced weed plants/patches identified in objective (ii).
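For the pixel-level stage of such a system, a common first step is to separate green plant material from soil before any species-level classification. The sketch below is purely illustrative and is not the project's Bayesian Network approach: it computes the widely used excess-green index from the red, green and blue channels the cameras record, on a hypothetical image array and with a hypothetical threshold.

```python
# Illustrative sketch only: separating green plant material from soil in an
# RGB image using the excess-green index (ExG = 2g - r - b on normalised
# channels). This is a common first step in machine-vision weed work, not the
# Bayesian Network classification proposed above; the image and threshold are
# hypothetical.
import numpy as np

def vegetation_mask(rgb: np.ndarray, threshold: float = 0.1) -> np.ndarray:
    """Return a boolean mask of likely vegetation pixels from an RGB image."""
    rgb = rgb.astype(float)
    total = rgb.sum(axis=2) + 1e-6          # avoid division by zero
    r, g, b = (rgb[..., i] / total for i in range(3))
    exg = 2.0 * g - r - b                   # excess-green index per pixel
    return exg > threshold

# Synthetic 4x4 "image": top half greenish (plant-like), bottom half brownish soil.
image = np.zeros((4, 4, 3), dtype=np.uint8)
image[:2] = [60, 160, 40]
image[2:] = [120, 90, 60]
print(vegetation_mask(image))
```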
Abstract:
Moist convection is well known to be generally more intense over continental than maritime regions, with larger updraft velocities, graupel, and lightning production. This study explores the transition from maritime to continental convection by comparing the trends in Tropical Rainfall Measuring Mission (TRMM) radar and microwave (37 and 85 GHz) observations over islands of increasing size to those simulated by a cloud-resolving model. The observed storms were essentially maritime over islands of <100 km2 and continental over islands >10 000 km2, with a gradual transition in between. Equivalent radar and microwave quantities were simulated from cloud-resolving runs of the Weather Research and Forecasting model via offline radiation codes. The model configuration was idealized, with islands represented by regions of uniform surface heat flux without orography, using a range of initial sounding conditions without strong horizontal winds or aerosols. Simulated storm strength varied with initial sounding, as expected, but also increased sharply with island size in a manner similar to observations. Stronger simulated storms were associated with higher concentrations of large hydrometeors. Although biases varied with different ice microphysical schemes, the trend was similar for all three schemes tested and was also seen in 2D and 3D model configurations. The successful reproduction of the trend with such idealized forcing supports previous suggestions that mesoscale variation in surface heating—rather than any difference in humidity, aerosol, or other aspects of the atmospheric state—is the main reason that convection is more intense over continents and large islands than over oceans. Some dynamical storm aspects, notably the peak rainfall and minimum surface pressure low, were more sensitive to surface forcing than to the atmospheric sounding or ice scheme. Large hydrometeor concentrations and simulated microwave and radar signatures, however, were at least as sensitive to initial humidity levels as to surface forcing and were more sensitive to the ice scheme. Issues with running the TRMM simulator on 2D simulations are discussed, but they appear to be less serious than sensitivities to model microphysics, which were similar in 2D and 3D. This supports the further use of 2D simulations to economically explore modeling uncertainties.
Abstract:
Purpose - The role of affective states in consumer behaviour is well established. However, no study to date has empirically examined online affective states as a basis for constructing typologies of internet users and for assessing the invariance of clusters across national cultures.
Design/methodology/approach - Four focus groups with internet users were carried out to adapt a set of affective states identified from the literature to the online environment. An online survey was then designed to collect data from internet users in four Western and four East Asian countries.
Findings - Based on a cluster analysis, six cross-national market segments are identified and labelled "Positive Online Affectivists", "Offline Affectivists", "On/Off-line Negative Affectivists", "Online Affectivists", "Indistinguishable Affectivists", and "Negative Offline Affectivists". The resulting clusters discriminate on the basis of national culture, gender, working status and perceptions towards online brands.
Practical implications - Marketers may use this typology to segment internet users in order to predict their perceptions towards online brands. Also, a standardised approach to e-marketing is not recommended on the basis of affective state-based segmentation.
Originality/value - This is the first study proposing affective state-based typologies of internet users using comparable samples from four Western and four East Asian countries.
Abstract:
This paper approaches the subject of brand equity measurement on and offline. The existing body of research knowledge on brand equity measurement has derived from classical contexts; however, the majority of today's brands prosper simultaneously online and offline. Since branding on the Web needs to address the unique characteristics of computer-mediated environments, it was posited that classical measures of brand equity were inadequate for this category of brands. Aaker's guidelines for building a brand equity measurement system were thus followed and his brand equity ten was employed as a point of departure. The main challenge was complementing traditional measures of brand equity with new measures pertinent to the Web. Following 16 semi-structured interviews with experts, ten additional measures were identified.
Abstract:
Cannabis sativa has been associated with contradictory effects upon seizure states despite its medicinal use by numerous people with epilepsy. We have recently shown that the phytocannabinoid cannabidiol (CBD) reduces seizure severity and lethality in the well-established in vivo model of pentylenetetrazole-induced generalised seizures, suggesting that earlier, small-scale clinical trials examining CBD effects in people with epilepsy warrant renewed attention. Here, we report the effects of pure CBD (1, 10 and 100 mg/kg) in two other established rodent seizure models, the acute pilocarpine model of temporal lobe seizure and the penicillin model of partial seizure. Seizure activity was video recorded and scored offline using model-specific seizure severity scales. In the pilocarpine model CBD (all doses) significantly reduced the percentage of animals experiencing the most severe seizures. In the penicillin model, CBD (≥10 mg/kg) significantly decreased the percentage mortality as a result of seizures; CBD (all doses) also decreased the percentage of animals experiencing the most severe tonic–clonic seizures. These results extend the anticonvulsant profile of CBD; when combined with a reported absence of psychoactive effects, this evidence strongly supports CBD as a therapeutic candidate for a diverse range of human epilepsies.