825 results for declarative and procedural knowledge
Abstract:
Woodworking industries still suffer from wood dust problems, and young workers are especially vulnerable to safety risks. To reduce these risks, it is important to change attitudes and increase knowledge about safety. Safety training has been shown to establish positive attitudes towards safety among employees. The aim of the current study is to analyze the effect of QR codes that link to Picture Mix EXposure (PIMEX) videos by analyzing students' responses regarding their attitudes to this safety training method and to safety. Safety training videos were used in upper secondary school handicraft programs to demonstrate wood dust risks and methods to decrease exposure to wood dust. A preliminary study was conducted in two schools to investigate improvements to safety training, in preparation for the main study, which investigated the safety training method in three schools. In the preliminary study the PIMEX method was first used: students were filmed while their wood dust exposure was measured and displayed on a computer screen in real time. Before and after the filming, teachers, students, and researchers together analyzed wood dust risks and effective measures to reduce exposure. For the main study, QR codes linking to PIMEX videos were attached to wood processing machines. Subsequent interviews showed that this safety training method enables students, at an early stage of their lives, to learn about risks and about safety measures to control wood dust exposure. The new combination of methods can create awareness and change attitudes and motivation among students to work more often to reduce wood dust.
Can therapy dogs evoke awareness of one's past and present life in persons with Alzheimer's disease?
Abstract:
BACKGROUND: Persons with Alzheimer's disease (AD) sometimes express themselves through behaviours that are difficult for themselves and their caregivers to manage, and alternative methods are recommended to minimise these symptoms. For some time now, animals have been introduced in different ways into the environment of persons with dementia. Animal-Assisted Therapy (AAT) includes prescribed therapy dogs visiting the person with dementia for a specific purpose. AIM: This study aims to illuminate the meaning of the lived experience of encounters with a therapy dog for persons with Alzheimer's disease. METHOD: Video-recorded sessions were conducted for each visit of the dog and its handler to a person with AD (10 times per person). The observations take a life-world approach and were transcribed and analysed using a phenomenological hermeneutical approach. RESULTS: The results show a main theme, 'Being aware of one's past and present existence', meaning to connect with one's senses and memories and to reflect upon these with the dog. The time spent with the dog shows the person recounting memories and feelings, and provides an opportunity to reach the person on a cognitive level. CONCLUSIONS: The present study may contribute to health care research and provide knowledge about the use of trained therapy dogs in the care of older persons with AD in a way that might increase quality of life and well-being in persons with dementia. IMPLICATIONS FOR PRACTICE: The study might be useful for caregivers and dog handlers in the care of older persons with dementia.
Abstract:
This study looks at the historical context in which PACs developed, as well as the current legal environment in which they operate. It will also briefly discuss the legal and procedural challenges that candidates face and the ways in which PACs alleviate some of these pressures in ways that presidential committees cannot. An understanding of the strategic dilemmas which cause candidates to seek extraneous structures through which to establish campaign networks is essential to extrapolating the potential future of campaign finance strategy. Furthermore, this study provides an in-depth analysis of the state Commonwealth PACs both in terms of fundraising and spending, and discusses the central issues this state PAC strategy raises with respect to campaign finance law. The study will conclude with a look into the future of campaign financing and the role these state-level PACs may play if current rules are not revised.
Abstract:
Due to the increase in water demand and hydropower energy, it is becoming more important to operate hydraulic structures efficiently while sustaining multiple demands. In particular, companies, governmental agencies, and consulting offices require effective, practical integrated tools and decision-support frameworks to operate reservoirs, cascades of run-of-river plants, and related elements such as canals by merging hydrological and reservoir simulation/optimization models with numerical weather predictions, radar, and satellite data. Model performance is highly dependent on the streamflow forecast, its associated uncertainty, and how that uncertainty is considered in decision making. While deterministic weather predictions and their corresponding streamflow forecasts restrict the manager to single deterministic trajectories, probabilistic forecasts can be a key solution by including uncertainty in flow forecast scenarios for dam operation. The objective of this study is to compare deterministic and probabilistic streamflow forecasts on a previously developed basin/reservoir model for short-term reservoir management. The study is applied to the Yuvacık Reservoir and its upstream basin, the main water supply of Kocaeli City in northwestern Turkey. The reservoir is a typical example because of its limited capacity, downstream channel restrictions, and high snowmelt potential. Mesoscale Model 5 and Ensemble Prediction System data are used as the main inputs, and the flow forecasts are produced for the year 2012 using HEC-HMS. A hydrometeorological rule-based reservoir simulation model is implemented in HEC-ResSim and integrated with the forecasts. Since the EPS-based hydrological model produces a large number of equally probable scenarios, it indicates how uncertainty spreads into the future. Thus, compared with the deterministic approach, it provides the operator with risk ranges in terms of spillway discharges and reservoir level. The framework is fully data driven, applicable, and useful to the profession, and the knowledge can be transferred to other similar reservoir systems.
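The contrast between a single deterministic trajectory and an ensemble of scenarios can be pictured with a minimal sketch. The mass balance, inflow numbers, demand, and capacity below are hypothetical and are not taken from the Yuvacık model; the sketch only shows how an ensemble of inflow scenarios yields a range of reservoir levels and spillway volumes rather than one trajectory.

```python
import numpy as np

def simulate(inflows, storage0, capacity, demand):
    """Toy daily mass balance: add inflow, subtract demand, spill above capacity."""
    storage, levels, spills = storage0, [], []
    for q in inflows:
        storage = max(storage + q - demand, 0.0)
        spill = max(0.0, storage - capacity)
        storage -= spill
        levels.append(storage)
        spills.append(spill)
    return np.array(levels), np.array(spills)

rng = np.random.default_rng(42)
horizon, capacity, demand = 10, 60.0, 4.0                    # hypothetical units
deterministic = np.full(horizon, 6.0)                        # single forecast trajectory
ensemble = rng.gamma(shape=3.0, scale=2.0, size=(50, horizon))  # 50 EPS-style members

det_levels, det_spills = simulate(deterministic, 50.0, capacity, demand)
end_levels, total_spills = [], []
for member in ensemble:
    lv, sp = simulate(member, 50.0, capacity, demand)
    end_levels.append(lv[-1])
    total_spills.append(sp.sum())

print("deterministic: end level", det_levels[-1], "total spill", det_spills.sum())
print("ensemble end level, 5-95%:", np.percentile(end_levels, [5, 95]))
print("ensemble total spill, 5-95%:", np.percentile(total_spills, [5, 95]))
```

The percentile spread is the kind of risk range (in reservoir level and spillway volume) that a deterministic run cannot provide.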
Abstract:
The general commitments and working requirements of abstract science, applied science, and the art of science, including economics, are assessed. Pure economics deals with the logic of the phenomenon. Positive socio-economics presupposes pure economics and many distinct sciences. Art presupposes socio-economics and direct knowledge of the specificities that characterize the time-space individuality of the phenomenon. This indetermination was partially formulated by Senior and Mill; graduate education in economics is considered in its light. The habit of ignoring it is the Ricardian Vice, as named by Schumpeter; the prevalence of the vice is exemplified, and its causes analyzed.
Abstract:
This work analyzes the development of dynamic capabilities in a context of institutional turbulence, different from the conditions under which this theoretical perspective is usually studied. A historical, processual case study is carried out, analyzing the emergence of Dynamic Capabilities in Brazilian banks from the development of banking technology between the 1960s and the 1990s. Building on strategy propositions that explain firms' competitive advantages through their resources, knowledge, and Dynamic Capabilities, a framework is constructed and used to analyze a number of testimonies given to the book "Tecnologia bancária no Brasil: uma história de conquistas, uma visão de futuro" (FONSECA; MEIRELLES; DINIZ, 2010) and in interviews conducted for this work. The testimonies show that banks invested heavily in technology from the financial reform of 1964 onwards, which opened a sequence of periods, each with its own institutional characteristics. As conditions changed in each period, the banks also changed their computerization process. At first, projects were executed ad hoc, under the direct command of the banks' leaders. Over time, as the technology evolved, the technological infrastructure grew, and institutional turbulence arose, the banks progressively developed partnerships with one another and with local suppliers, decentralized the technology area, became more flexible, strengthened corporate governance, and adopted a series of routines to manage information technology, which led to the gradual development of the microfoundations of Dynamic Capabilities over these periods. In the mid-1990s, institutional stabilization and the opening of the economy to foreign competition placed the country in the conditions that the adopted theoretical perspective considers ideal for Dynamic Capabilities to be sources of competitive advantage. Brazilian banks proved prepared to face this new phase, which is evidence that they had developed Dynamic Capabilities in the preceding decades, and part of that development can be attributed to the institutional turbulence they had faced.
Abstract:
The broader objective of this study can be articulated in the following specific aims: to measure consumers' attitudes towards the brands displayed by this strategy, and to assess the recall, recognition, and purchase intentions that product placement generates in consumers. In addition, the study examines the differences and similarities between the behavior of Brazilian and American consumers caused by the influence of product placements. The study targeted consumer audiences in Brazil and the U.S. A range of modeling setups was performed in order to align the study instruments and hypotheses with the research objectives. The study focused on the following hypothesized models. H1: Consumers/participants who viewed the brands/products in the movie have higher brand/product recall than consumers/participants who did not view the brands/products in the movie. H2: US consumers/participants are better able to recognize and recall brands/products that appear in the background of the movie than consumers/participants from Brazil. H3: Consumers/participants from the USA are more accepting of product placements than their counterparts in Brazil. H4: There are discernible similarities in brand attitudes and purchase intentions between consumers/participants from the USA and Brazil, despite their different countries of origin. Cronbach's alpha coefficient was used to ensure the reliability of the survey instruments. Structural Equation Modeling (SEM) was used for hypothesis testing. The study used Confirmatory Factor Analysis (CFA) to assess both convergent and discriminant validity, instead of Exploratory Factor Analysis (EFA) or Principal Component Analysis (PCA). This supported the further use of regression, chi-square, and t tests. Only hypothesis H3 was rejected; the others were not. The t tests provided insight into significant differences for specific subgroups. In the SEM testing, the error variance for product placement attitudes was negative for both groups, a Heywood case, which was corrected to remove the negative values. The researcher used both quantitative and qualitative approaches, with closed-ended questionnaires and interviews, respectively, used to collect primary data. The results were additionally presented in tabulations. It can be concluded that product placement effects differ markedly between the U.S. and Brazil, based on the influence of a range of factors examined in the study. However, there are elements of convergence, probably driven by convergence in technology. For product placement to become more competitive in promotional marketing, researchers will need to extend their focus beyond the traditional variables and add knowledge about conventional marketplace factors, that is, the sell-ability of product placement technologies and strategies.
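As a small illustration of the reliability step mentioned above, the sketch below computes Cronbach's alpha for a set of Likert-type items. The item scores are invented for illustration and are not data from the study's survey.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: respondents x items matrix of scores."""
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1)      # variance of each item
    total_variance = items.sum(axis=1).var(ddof=1)  # variance of the summed scale
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

# Hypothetical 5-point Likert responses (6 respondents x 4 items).
scores = np.array([
    [4, 5, 4, 4],
    [3, 3, 4, 3],
    [5, 5, 5, 4],
    [2, 3, 2, 3],
    [4, 4, 5, 4],
    [3, 2, 3, 3],
])
print(f"Cronbach's alpha = {cronbach_alpha(scores):.2f}")
```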
Abstract:
This study investigates the one-month-ahead, out-of-sample predictive power of a Taylor-rule-based model for exchange rate forecasting. We review relevant studies concluding that macroeconomic models can explain short-run exchange rates, as well as studies that are skeptical of the ability of macroeconomic variables to predict exchange rate movements. To contribute to this debate, this work presents its own evidence by implementing the specification with the best predictive performance reported by Molodtsova and Papell (2009), the "symmetric Taylor rule model with heterogeneous coefficients, smoothing, and a constant". To do so, we use a sample of 14 currencies against the US dollar, generating monthly out-of-sample forecasts from January 2000 to March 2014. Following the criterion adopted by Galimberti and Moura (2012), we focus on countries that adopted a floating exchange rate regime and inflation targeting, but we include currencies of both developed and developing countries. Our results corroborate Rogoff and Stavrakeva (2008) in finding that conclusions about exchange rate predictability depend on the statistical test adopted, so robust and rigorous tests are required for a proper evaluation of the model. After finding that it is not possible to claim that the implemented model produces more accurate forecasts than a random walk, we assess whether the model is at least able to generate "rational", or "consistent", forecasts. For this, we use the theoretical and empirical framework defined and implemented by Cheung and Chinn (1998) and conclude that the forecasts from the Taylor rule model are "inconsistent". Finally, we run Granger causality tests to verify whether the lagged values of the returns predicted by the structural model explain the observed contemporaneous values. We find that the fundamentals-based model is unable to anticipate realized returns.
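A minimal sketch of the rolling one-month-ahead exercise described above is given below. The predictors follow the general form of a symmetric Taylor-rule specification (inflation differential, output-gap differential, an interest-rate differential for smoothing, plus a constant), but the data are simulated and the implementation is a simplified stand-in, not a reproduction of the Molodtsova and Papell (2009) model.

```python
import numpy as np

rng = np.random.default_rng(0)
T = 200

# Simulated monthly fundamentals: inflation differential, output-gap differential,
# and an interest-rate differential (the smoothing term).
infl_diff = rng.normal(0.0, 1.0, T)
gap_diff = rng.normal(0.0, 1.0, T)
irate_diff = rng.normal(0.0, 1.0, T)
X = np.column_stack([np.ones(T), infl_diff, gap_diff, irate_diff])

# Next month's log exchange-rate change, weakly driven by this month's fundamentals.
ds = np.empty(T)
ds[0] = rng.normal(0.0, 1.0)
ds[1:] = (0.10 * infl_diff[:-1] + 0.05 * gap_diff[:-1] - 0.02 * irate_diff[:-1]
          + rng.normal(0.0, 1.0, T - 1))

window = 120                      # rolling estimation window (illustrative choice)
model_err, rw_err = [], []
for t in range(window, T - 1):
    # Estimate the Taylor-rule regression on the last `window` observations...
    beta, *_ = np.linalg.lstsq(X[t - window:t], ds[t - window + 1:t + 1], rcond=None)
    # ...then forecast next month's change from this month's fundamentals.
    model_err.append(ds[t + 1] - X[t] @ beta)
    rw_err.append(ds[t + 1] - 0.0)   # driftless random walk: predict no change

rmse = lambda e: np.sqrt(np.mean(np.square(e)))
print("Taylor-rule model RMSE:", round(rmse(model_err), 4))
print("Random-walk RMSE      :", round(rmse(rw_err), 4))
```

Comparing the two RMSE values is the simplest version of the model-versus-random-walk evaluation; the study's point is that such comparisons also require robust statistical tests, not raw RMSE alone.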
Abstract:
Sharing sensor data between multiple devices and users can be challenging for naive users: it requires programming knowledge and the use of different communication channels and/or development tools, leading to non-uniform solutions. This thesis proposes a system that allows users to access sensors, share sensor data, and manage sensors. With this system we intend to manage devices, share sensor data, compare sensor data, and set policies to act based on rules. This thesis presents the design and implementation of the system, as well as three case studies of its use.
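The "policies to act based on rules" mentioned above can be pictured with a small sketch. The rule format, sensor names, and threshold below are hypothetical, since the abstract does not spell out the policy syntax; the sketch only shows the general idea of evaluating shared sensor readings against user-defined rules.

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Rule:
    sensor: str                          # sensor identifier the rule watches
    predicate: Callable[[float], bool]   # condition on the latest reading
    action: Callable[[float], None]      # what to do when the condition holds

def evaluate(readings: dict[str, float], rules: list[Rule]) -> None:
    """Apply every rule whose sensor has a reading satisfying its predicate."""
    for rule in rules:
        value = readings.get(rule.sensor)
        if value is not None and rule.predicate(value):
            rule.action(value)

# Hypothetical shared readings and a single temperature policy.
rules = [Rule("kitchen/temperature",
              predicate=lambda v: v > 30.0,
              action=lambda v: print(f"ALERT: temperature {v} above threshold"))]
evaluate({"kitchen/temperature": 32.5, "garden/humidity": 41.0}, rules)
```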
Abstract:
A stir bar sorptive extraction with liquid desorption followed by large volume injection coupled to gas chromatography–quadrupole mass spectrometry (SBSE-LD/LVI-GC–qMS) method was evaluated for the simultaneous determination of higher alcohol acetates (HAA), isoamyl esters (IsoE) and ethyl esters (EE) of fatty acids. The method performance was assessed and compared with another solventless technique, solid-phase microextraction (SPME) in headspace mode (HS). For both techniques, the influential experimental parameters were optimised to provide sensitive and robust methods. The SBSE-LD/LVI methodology was first optimised in terms of extraction time, influence of ethanol in the matrix, liquid desorption (LD) conditions and instrumental settings. The highest extraction efficiency was obtained using an extraction time of 60 min, 10% ethanol content, n-pentane as desorption solvent, a 15 min back-extraction period, a solvent vent flow rate of 10 mL min−1 and an inlet temperature of 10 °C. For HS-SPME, the fibre coated with 50/30 μm divinylbenzene/carboxen/polydimethylsiloxane (DVB/CAR/PDMS) afforded the highest extraction efficiency, providing the best sensitivity for the target volatiles, particularly when the samples were extracted at 25 °C for 60 min under continuous stirring in the presence of sodium chloride (10% (w/v)). Both methodologies showed good linearity over the concentration ranges tested, with correlation coefficients higher than 0.984 for HS-SPME and 0.982 for the SBSE-LD approach, for all analytes. Good reproducibility was attained and low detection limits were achieved with both SBSE-LD (0.03–28.96 μg L−1) and HS-SPME (0.02–20.29 μg L−1). The quantification limits ranged from 0.11 to 96.56 μg L−1 for the SBSE-LD approach and from 0.06 to 67.63 μg L−1 for HS-SPME. Using the HS-SPME approach an average recovery of about 70% was obtained, whilst average recoveries obtained using SBSE-LD were close to 80%. The analytical and procedural advantages and disadvantages of the two methods are compared. Both analytical methods were used to determine the HAA, IsoE and EE content of "Terras Madeirenses" table wines. A total of 16 esters were identified and quantified in the wine extracts by HS-SPME, whereas 25 esters were found by the SBSE-LD technique, including 2 higher alcohol acetates, 4 isoamyl esters and 19 ethyl esters of fatty acids. Overall, SBSE-LD provided higher sensitivity with decreased analysis time.
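For the figures of merit reported above (linearity, detection and quantification limits), one common way to derive LOD and LOQ is from the calibration curve, taking LOD = 3.3·s/b and LOQ = 10·s/b, where s is the residual standard deviation and b the slope (the ICH-style approach; the paper does not state which estimator it used). The sketch below applies this to an invented calibration series; the concentrations and peak areas are illustrative, not data from the paper.

```python
import numpy as np

# Hypothetical calibration of one ester: concentration (ug/L) vs. peak area ratio.
conc = np.array([1.0, 5.0, 10.0, 25.0, 50.0, 100.0])
area = np.array([0.021, 0.104, 0.199, 0.512, 1.010, 2.030])

# Ordinary least-squares fit of the calibration line.
slope, intercept = np.polyfit(conc, area, 1)
pred = slope * conc + intercept
resid_sd = np.sqrt(np.sum((area - pred) ** 2) / (len(conc) - 2))  # residual std dev
r = np.corrcoef(conc, area)[0, 1]                                  # linearity check

lod = 3.3 * resid_sd / slope    # limit of detection
loq = 10.0 * resid_sd / slope   # limit of quantification

print(f"slope={slope:.4f}  r={r:.4f}  LOD={lod:.2f} ug/L  LOQ={loq:.2f} ug/L")
```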
Abstract:
This study aims to analyze the implications that knowledge of an important work in the History of Science, De revolutionibus orbium coelestium by Nicholas Copernicus, can have for the education of Mathematics teachers. The study focuses on Book I of Copernicus's work, whose final part contains the Table of the Subtended Straight Lines in a Circle, a true sine table constructed by the author. The study draws on two theoretical strands: the History of Science and of Mathematics in teacher education, as discussed by Miguel and Miorim, Brito, Neves and Martins, and Radford, among others; and the teaching knowledge teachers must have, based on Gauthier, Shulman and Imbernón, among others. From these, a network of knowledge is established, grouped into dimensions such as the mathematical, the psycho-pedagogical, the cultural, and practical diversity, which guide the analysis. To gather further elements for the analysis, beyond the theoretical study of Book I, a sine table was constructed with undergraduate students, future Mathematics teachers, following the scheme used in De revolutionibus. The study also describes the life and work of Nicholas Copernicus, highlighting the historical context in which the author lived and the conceptions of the Universe existing at that time. The research reveals that the work studied is an important source of culture, able to provide the Mathematics teacher in training, beyond conceptual and procedural mathematical knowledge, with a cultural knowledge that opens him or her to knowledge from areas other than his or her own specific field, and thus to acquire knowledge about world history, the development of the sciences, and society.
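The link between the table of subtended lines and a modern sine table can be made explicit with one relation, sketched below. The radius value R = 100000 is the commonly cited scale of Copernicus's table and is stated here as an assumption rather than taken from the study itself.

```latex
% Chord subtending an arc \theta in a circle of radius R, and its inverse use:
\[
  \operatorname{crd}(\theta) = 2R \sin\!\left(\frac{\theta}{2}\right)
  \qquad\Longrightarrow\qquad
  \sin\theta = \frac{\operatorname{crd}(2\theta)}{2R}.
\]
% Worked example with the assumed tabular radius R = 100000:
% crd(60 degrees) = 2 * 100000 * sin(30 degrees) = 100000,
% hence sin(30 degrees) = 100000 / (2 * 100000) = 0.5.
```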
Abstract:
Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)