929 results for Event Study Methodology


Relevance: 30.00%

Abstract:

In order to grow, cities increasingly compete for attention, jobs, investments, visitors, residents and significant events. Cities need creative solutions to keep up with the competition; they ought to become creative cities. Attracting talented and diverse inhabitants is a key factor in developing a creative city, which is characterized by openness, tolerance, vibrancy and diversity. Alongside the need for renewed city images, city brand building has become popular. Helsinki is the World Design Capital 2012 (WDC 2012), and this mega-event presents a meaningful opportunity for the city to broadcast itself globally. The purpose of this study is to evaluate how Helsinki brands itself as a creative city through an international mega-event. The sub-aims are to: 1) map the factors behind the creative city and their relation to the city of Helsinki, 2) describe the city branding process, and 3) evaluate the role of the Helsinki World Design Capital 2012 mega-event in Helsinki's creative city brand building. First, the theory discusses the concept of the creative city, which has gained growing attention during the past decade. Then, the city branding process is described and the benefits of hosting a mega-event are presented. Finally, co-branding a city and a mega-event in order to generate maximum benefit from the mega-event is reviewed. This is a qualitative study for which data were collected through three face-to-face interviews, the World Design Capital 2012 bid, Helsinki's economic development strategy, a consulting firm's research report on the case city, and web pages. The research reveals that Helsinki has shown interest in the creative city discussion, although the terminology around the concept is approached carefully. Helsinki fits many of the creative city characteristics and recognizes its flaws, for which improvement strategies have been planned. The bottlenecks keeping the city from promoting a more open mind lie mainly in its organizational structures. Helsinki has no official brand strategy; nonetheless, pressure to develop one is present. The World Design Capital 2012 mega-event is seen as a meaningful stepping stone for strengthening Helsinki's identity and image and for starting to think about a city brand. The brand strategies of the mega-event support the values and virtues of the city itself, which enables the benefits of co-branding introduced in the theory part. Helsinki has no official brand and does not call itself a creative city; however, this study shows signs of the city taking steps towards building a creative city brand with the help of the Helsinki World Design Capital 2012 mega-event.

Relevance: 30.00%

Abstract:

Vision affords us the ability to consciously see and to use this information in our behavior. While research has produced a detailed account of the function of the visual system, the neural processes that underlie conscious vision are still debated. One aim of the present thesis was to examine the time course of the neuroelectrical processes that correlate with conscious vision. The second aim was to study the neural basis of unconscious vision, that is, situations where a stimulus that is not consciously perceived nevertheless influences behavior. According to current prevalent models of conscious vision, the activation of visual cortical areas is not, as such, sufficient for consciousness to emerge, although it might be sufficient for unconscious vision. Conscious vision is assumed to require reciprocal communication between cortical areas, but views differ substantially on the extent of this recurrent communication. Visual consciousness has been proposed to emerge from recurrent neural interactions within the visual system, while other models claim that more widespread cortical activation is needed for consciousness. Studies I–III compared models of conscious vision by studying event-related potentials (ERPs). ERPs represent the brain's average electrical response to stimulation. The results support the model that associates conscious vision with activity localized in the ventral visual cortex. The timing of this activity corresponds to an intermediate stage in visual processing. Earlier stages of visual processing may influence what becomes conscious, although these processes do not directly enable visual consciousness. Late processing stages, when more widespread cortical areas are activated, reflect access to and manipulation of the contents of consciousness. Studies IV and V concentrated on unconscious vision. Using transcranial magnetic stimulation (TMS), we show that when early visual cortical processing is disturbed so that subjects fail to consciously perceive visual stimuli, they may nevertheless guess (above chance level) the location where the visual stimuli were presented. However, the results also suggest that in a similar situation, the early visual cortex is necessary for both conscious and unconscious perception of chromatic information (i.e., color). Chromatic information that remains unconscious may influence behavioral responses when activity in the visual cortex is not disturbed by TMS. Our results support the view that early stimulus-driven (feedforward) activation may be sufficient for unconscious processing. In conclusion, the results of this thesis support the view that conscious vision is enabled by a series of processing stages. The processes that most closely correlate with conscious vision take place in the ventral visual cortex ~200 ms after stimulus presentation, although preceding time periods and contributions from other cortical areas, such as the parietal cortex, are also indispensable. Unconscious vision relies on intact early visual activation, although the location of a visual stimulus may be unconsciously resolved even when activity in the early visual cortex is interfered with.
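
As a minimal illustration of the ERP averaging mentioned above ("the brain's average electrical response to stimulation"), the following Python sketch cuts a continuous EEG trace into stimulus-locked epochs and averages them; the function name, window lengths and baseline correction are illustrative assumptions, not details from the thesis:

```python
import numpy as np

def average_erp(eeg, stim_samples, fs, pre=0.1, post=0.5):
    """Average stimulus-locked EEG epochs into an ERP (illustrative sketch).

    eeg: 1-D array holding one channel of continuous EEG
    stim_samples: sample indices of stimulus onsets (all assumed in range)
    fs: sampling rate in Hz
    """
    n_pre, n_post = int(pre * fs), int(post * fs)
    # cut one epoch per stimulus, from `pre` s before to `post` s after onset
    epochs = np.stack([eeg[s - n_pre : s + n_post] for s in stim_samples])
    # baseline-correct each epoch on its pre-stimulus interval
    epochs -= epochs[:, :n_pre].mean(axis=1, keepdims=True)
    # averaging cancels activity that is not time-locked to the stimulus
    return epochs.mean(axis=0)
```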

Relevance: 30.00%

Abstract:

This paper presents a scientometric study of parasites in fish farming in Brazil, including a significant review of the literature. The methodology was based on searching for articles in three different databases, carried out in May 2012: ISI (Institute for Scientific Information), SciELO (Scientific Electronic Library Online), and Google Scholar. The number of articles on fish parasites is mounting (currently over 110) and has increased greatly since 1995. However, the quantity is still low compared with the number of papers on parasites of fish from natural environments. In Brazil, the farmed fish studied most are pacu, tilapia and tambaqui. Monogeneans represent the most prevalent group, followed by protozoans and crustaceans. The regions most researched were the southeast and south, accounting for 84% of the total literature. The main issue addressed in the articles was pathology, followed by treatment and occurrence records. In conclusion, the treatment of parasitic diseases of farmed fish in Brazil is still incipient, highlighting the importance and usefulness of management practices to prevent the occurrence of health problems.

Relevance: 30.00%

Abstract:

Interconnected domains are attracting interest from industry and academia, although this phenomenon, called 'convergence', is not new. Organizational research has indeed focused on uncovering co-creation for manufacturing and the industrial organization, with limited implications for entrepreneurship. Although convergence has been characterized as a process connecting seemingly disparate disciplines, it is argued that these studies tend to leave the creative industries unnoticed. With the art market booming and new forms of collaboration riding past the institution-focused arts marketing literature, this thesis takes a leap to uncover the processes of entrepreneurship in the emergence of a cultural product. As a symbolic work of synergism itself, the thesis combines organizational theory with literature from the natural sciences and the arts. Assuming nonlinearity, a framework is created for analysing aesthetic experience in an empirical event where network actors are connected to multiple contexts. As the focal case of the study, the empirical analysis concerns a music festival organized at a skiing resort in the French Alps in March. The researcher attended the festival and modelled its co-creation process by interviewing an artist, festival organisers, and a festival visitor. The findings contribute mainly to the fields of entrepreneurship, aesthetics and marketing. It is found that the network actors engage in intimate and creative interaction in which activity patterns are interrupted and cultural elements combined. This process is considered to both create and destroy value, through identity building, legitimisation, learning, and access to larger audiences, and it is considered particularly useful for domains where resources are too constrained for conventional marketing practices. The thesis uncovers the role of artists as informants and posits that, particularly in experience design, this type of skilled individual should more often be regarded as a research informant. Future research is encouraged to engage with convergence by experimenting with different fields and research designs, and it is suggested that future studies could arrive at different descriptive results.

Relevance: 30.00%

Abstract:

This paper describes a mathematical and graphical model of face aging. It considers the possibility of predicting the aging process by offering an initial quantification of this process as it applies to the face, and it is concerned with physical measurements and a general law of time dependence. After measuring and normalizing a photograph of a person, one could predict, with a known amount of error, the appearance of that person at a different age. The technique described has served its purpose successfully, with a representative amount of patient data lying sufficiently near the general aging curve of each parameter. The model uses a warping technique to emulate aging changes on the faces of women. Warping methods are frequently based on interpolation between images or on general mathematical functions used to calculate pixel attributes. The implemented process considers the age features of selected parts of the face, such as the face outline and the shape of the lips. These age features were obtained by measuring the facial regions of women who were photographed throughout their lives. The present work is concerned first with discussing a methodology to define the aging parameters that can be measured, and second with representing the age effects graphically.
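
As an illustration of the measure-fit-predict idea described above, the following Python sketch fits a general aging curve to one hypothetical normalized facial parameter and predicts its value at a new age; the parameter name, values and polynomial degree are invented for the example and do not come from the paper:

```python
import numpy as np

# Hypothetical normalized measurements of one facial parameter
# (e.g. lip width) for the same person at several ages.
ages  = np.array([10, 20, 30, 40, 50, 60])                 # years
lip_w = np.array([0.92, 1.00, 1.01, 0.99, 0.96, 0.93])     # normalized

# Fit a "general aging curve" for this parameter (degree-2 polynomial).
aging_curve = np.poly1d(np.polyfit(ages, lip_w, deg=2))

# Predict the parameter at a target age; the predicted value would then
# drive the warping of the corresponding facial landmarks, with pixel
# values interpolated between the source and warped positions.
target_age = 70
print(f"predicted normalized lip width at {target_age}: "
      f"{aging_curve(target_age):.3f}")
```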

Relevance: 30.00%

Abstract:

This work presents recent results concerning a design methodology used to estimate the positioning deviation of a gantry (Cartesian) manipulator, related mainly to the structural elastic deformation of components under operational conditions. The case-study manipulator is of the gantry type and its basic dimensions are 1.53 m × 0.97 m × 1.38 m. The dimensions used for calculating the effective workspace due to end-effector path displacement are 1 m × 0.5 m × 0.5 m. The manipulator is composed of four basic modules, defined as module X, module Y, module Z and the terminal arm, to which the end-effector is connected. Each module's controlled axis performs a linear-parabolic (trapezoidal velocity) positioning movement. The path-planning algorithm takes the maximum velocity and the total distance as input parameters for a given task; the acceleration and deceleration times are equal. The Denavit-Hartenberg parameterization method is used in the manipulator kinematics model. The gantry manipulator can be modeled as four rigid bodies with three translational degrees of freedom, connected as an open kinematic chain. Dynamic analyses were performed using inertial parameter specifications such as the mass, inertia and center-of-gravity position of each module. These parameters are essential for correct dynamic modelling of the manipulator, given the multiple possibilities of motion and the manipulation of objects with different masses. The dynamic analysis consists of a mathematical model of the static and dynamic interactions among the modules. The structural deformations are computed using the finite element method (FEM).
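
A minimal sketch of the linear-parabolic (trapezoidal velocity) positioning movement described above, taking the maximum velocity and the total distance as inputs, with equal acceleration and deceleration times; the function name, the explicit acceleration-time parameter and the defaults are illustrative assumptions:

```python
import numpy as np

def trapezoidal_profile(distance, v_max, t_acc, dt=0.001):
    """Position profile: parabolic ramp-up, linear cruise, parabolic ramp-down."""
    a = v_max / t_acc                          # constant acceleration
    d_acc = 0.5 * a * t_acc**2                 # distance covered while accelerating
    t_const = (distance - 2 * d_acc) / v_max   # duration of the cruise phase
    if t_const < 0:
        raise ValueError("distance too short to reach v_max; reduce v_max")
    t_total = 2 * t_acc + t_const
    t = np.arange(0.0, t_total, dt)
    s = np.where(
        t < t_acc, 0.5 * a * t**2,                              # parabolic ramp-up
        np.where(t < t_acc + t_const,
                 d_acc + v_max * (t - t_acc),                   # linear cruise
                 distance - 0.5 * a * (t_total - t)**2))        # parabolic ramp-down
    return t, s

# Example: one axis moving 0.5 m at 0.2 m/s peak with 0.5 s ramps.
t, s = trapezoidal_profile(distance=0.5, v_max=0.2, t_acc=0.5)
```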

Relevance: 30.00%

Abstract:

Industrial applications demand that robots operate in accordance with a specified position and orientation of their end-effector. This requires solving the inverse kinematics problem, which determines the joint displacements of the manipulator needed to accomplish a given objective. Complete studies of the dynamic control of robotic joints are also necessary. This article focuses first on the implementation of numerical algorithms for solving the inverse kinematics problem and on the modeling and simulation of dynamic systems, using a real-time implementation. The modeling and simulation of dynamic systems are performed with an emphasis on off-line programming. Next, a complete study of control strategies is carried out through the study of the elements of a robotic joint, such as the DC motor, inertia, and gearbox. Finally, a trajectory generator, used as input for a generic group of joints, is developed, and a proposal for implementing the joint controllers using an EPLD development system is presented.
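
The abstract does not specify which numerical algorithms were implemented; as one common approach, the following Python sketch solves the inverse kinematics of a hypothetical two-link planar arm by Newton iteration on the pose error using the Jacobian pseudo-inverse (link lengths, tolerances and names are invented for the example):

```python
import numpy as np

L1, L2 = 1.0, 0.8   # hypothetical link lengths [m]

def fk(q):
    """Forward kinematics: joint angles -> end-effector position."""
    x = L1 * np.cos(q[0]) + L2 * np.cos(q[0] + q[1])
    y = L1 * np.sin(q[0]) + L2 * np.sin(q[0] + q[1])
    return np.array([x, y])

def jacobian(q):
    """Analytic Jacobian of the 2-link planar arm."""
    s1, s12 = np.sin(q[0]), np.sin(q[0] + q[1])
    c1, c12 = np.cos(q[0]), np.cos(q[0] + q[1])
    return np.array([[-L1 * s1 - L2 * s12, -L2 * s12],
                     [ L1 * c1 + L2 * c12,  L2 * c12]])

def ik_newton(target, q0, tol=1e-8, max_iter=100):
    """Newton iteration: update joints by the pseudo-inverse of the Jacobian."""
    q = np.array(q0, dtype=float)
    for _ in range(max_iter):
        err = target - fk(q)
        if np.linalg.norm(err) < tol:
            break
        q += np.linalg.pinv(jacobian(q)) @ err
    return q

q = ik_newton(target=np.array([1.2, 0.6]), q0=[0.1, 0.5])
```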

Relevance: 30.00%

Abstract:

Few people see both the opportunities and the threats that IT legacy presents in the current world. On one hand, effective legacy management can bring substantial hard savings and a smooth transition to the desired future state. On the other hand, its mismanagement contributes to serious operational business risks, as old systems are not as reliable as business users require. This thesis offers one perspective on dealing with IT legacy: through effective contract management, as a component of achieving Procurement Excellence in IT, thus bridging IT delivery departments, IT procurement, business units, and suppliers. It develops a model for assessing the impact of improvements on the contract management process, along with a set of tools and advice regarding analysis and improvement actions. The thesis uses a case study to present and justify the implementation of Lean Six Sigma in an IT legacy contract management environment. Lean Six Sigma proved successful, and this thesis presents and discusses all the necessary steps, and the pitfalls to avoid, for achieving breakthrough improvement in IT contract management process performance. For the IT legacy contract management process, two improvements require special attention and can easily be copied to any organization. The first is the issue of diluted contract ownership, which stops all improvements because people do not know who is responsible for performing the required actions. The second is the contract management performance evaluation tool, which can be used for monitoring, for identifying outlying contracts, and for finding opportunities to improve the process. The study resulted in valuable insight into the benefits of applying Lean Six Sigma to improve IT legacy contract management, as well as into how Lean Six Sigma can be applied in an IT environment. Managerial implications are discussed. It is concluded that the use of the data-driven Lean Six Sigma methodology for improving existing IT contract management processes is a significant addition to the existing best practices in contract management.

Relevance: 30.00%

Abstract:

Longitudinal surveys are increasingly used to collect event history data on person-specific processes such as transitions between labour market states. Survey-based event history data pose a number of challenges for statistical analysis, including survey errors due to sampling, non-response, attrition and measurement. This study deals with non-response, attrition and measurement errors in event history data and the bias they cause in event history analysis. The study also discusses some choices faced by a researcher using longitudinal survey data for event history analysis and demonstrates their effects. These choices include whether a design-based or a model-based approach is taken, which subset of the data to use and, if a design-based approach is taken, which weights to use. The study takes advantage of the possibility of using combined longitudinal survey and register data: the Finnish subset of the European Community Household Panel (FI ECHP) survey for waves 1–5 was linked at the person level with longitudinal register data. Unemployment spells were used as the study variables of interest. Lastly, a simulation study was conducted to assess the statistical properties of the Inverse Probability of Censoring Weighting (IPCW) method in a survey data context. The study shows how combined longitudinal survey and register data can be used to analyse and compare the non-response and attrition processes, test the type of missingness mechanism and estimate the size of the bias due to non-response and attrition. In our empirical analysis, initial non-response turned out to be a more important source of bias than attrition. Reported unemployment spells were subject to seam effects, omissions and, to a lesser extent, overreporting. The use of proxy interviews tended to cause spell omissions. An often-ignored phenomenon, classification error in reported spell outcomes, was also found in the data. Neither the Missing At Random (MAR) assumption about the non-response and attrition mechanisms nor the classical assumptions about measurement errors turned out to be valid. Measurement errors in both spell durations and spell outcomes were found to bias estimates from event history models, with low measurement accuracy affecting the estimates of the baseline hazard most. The design-based estimates based on data from respondents to all waves of interest, weighted by the last-wave weights, displayed the largest bias. Using all the available data, including the spells of attriters up to the time of attrition, helped to reduce attrition bias. Lastly, the simulation study showed that the IPCW correction to the design weights reduces the bias due to dependent censoring in design-based Kaplan-Meier and Cox proportional hazards model estimators. The study discusses the implications of the results for survey organisations collecting event history data, for researchers using surveys for event history analysis, and for researchers who develop methods to correct for non-sampling biases in event history data.
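
A deliberately simplified Python sketch of the IPCW idea evaluated in the simulation study: the censoring process is estimated by reversing the event indicator, and its inverse survival probabilities are used as weights in a Kaplan-Meier fit. The data are invented, and a real analysis of dependent censoring would model censoring on covariates (e.g., with a Cox model) rather than marginally:

```python
import numpy as np
from lifelines import KaplanMeierFitter

# Hypothetical spell data: durations of unemployment spells (months) and
# an event indicator (1 = spell ended, 0 = censored, e.g. by attrition).
durations = np.array([3, 7, 2, 12, 5, 9, 4, 6])
observed  = np.array([1, 0, 1,  1, 0, 1, 1, 0])

# Step 1: estimate the censoring "survival" function S_C(t) by treating
# censoring as the event of interest (reverse the indicator).
km_cens = KaplanMeierFitter().fit(durations, 1 - observed)
s_c = km_cens.survival_function_at_times(durations).values

# Step 2: inverse probability of censoring weights, 1 / S_C(t_i).
ipcw = 1.0 / np.clip(s_c, 1e-6, None)

# Step 3: IPCW-weighted Kaplan-Meier estimate of the spell durations.
km = KaplanMeierFitter().fit(durations, observed, weights=ipcw)
print(km.survival_function_)
```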

Relevance: 30.00%

Abstract:

Harm Avoidance and Neuroticism are traits that predispose to mental illness, and studying them provides a unique way to study the predisposition to mental illness. Understanding the biological mechanisms that mediate vulnerability could lead to improvements in treatment and ultimately to pre-emptive psychiatry. These personality traits describe a tendency to feel negative emotions such as fear, shyness and worry. Previous studies suggest that these traits are regulated by the serotonin and opiate pathways. The aim of this thesis was to test the following hypotheses using personality trait measures and positron emission tomography (PET): 1) brain serotonin transporter density in vivo is associated with the Harm Avoidance and Neuroticism traits; 2) μ-opioid receptor binding is associated with Harm Avoidance. In addition, we developed a methodology for studying neurotransmitter interactions in the brain using the opiate and serotonin pathways. Thirty-two healthy subjects who were consistently in either the highest or the lowest quartile of the Harm Avoidance trait were recruited from a population-based cohort. Each subject underwent two PET scans: serotonin transporter binding was measured with [11C]MADAM and μ-opioid receptor binding with [11C]carfentanil. We found that the serotonin transporter is not associated with anxious personality traits. However, Harm Avoidance correlated positively with μ-opioid receptor availability; in particular, the tendency to feel shy and the inability to cope with stress were associated with μ-opioid receptor availability. We also demonstrated that serotonin transporter binding correlates with μ-opioid receptor binding, suggesting interplay between the two systems. These findings shed light on the neurobiological correlates of personality and have an impact on etiological considerations of affective disorders.

Relevance: 30.00%

Abstract:

More discussion is required on how, and which types of, biomass should be used to achieve a significant reduction in the carbon load released into the atmosphere in the short term. The energy sector is one of the largest greenhouse gas (GHG) emitters, and thus its role in climate change mitigation is important. Replacing fossil fuels with biomass has been a simple way to reduce carbon emissions because the carbon bound in biomass is considered carbon neutral. With this in mind, this thesis has the following objectives: (1) to study the significance of the different GHG emission sources related to energy production from peat and biomass, (2) to explore opportunities to develop more climate-friendly biomass energy options, and (3) to discuss the importance of the biogenic emissions of biomass systems. The discussion on biogenic carbon and other GHG emissions comprises four case studies, of which two consider peat utilization, one forest biomass and one cultivated biomasses. Various biomass types (peat, pine logs and forest residues, palm oil, rapeseed oil and jatropha oil) are used as examples to demonstrate the importance of biogenic carbon to life cycle GHG emissions. The biogenic carbon emissions of biomass are defined as the difference in the carbon stock between the utilization and non-utilization scenarios of the biomass. Forestry-drained peatlands were studied using the high emission values of the peatland types in question to discuss the emission reduction potential of the peatlands. The results are presented in terms of global warming potential (GWP) values. Based on the results, the climate impact of peat production can be reduced by selecting high-emission peatlands for peat production. The comparison of two different types of forest biomass in integrated ethanol production at a pulp mill shows that the type of forest biomass affects the biogenic carbon emissions of biofuel production. The assessment of cultivated biomasses demonstrates that several choices made along the production chain significantly affect the GHG emissions of biofuels. The emissions caused by a biofuel can exceed those of fossil-based fuels in the short term if part of the biomass is consumed in the process itself and does not end up in the final product. Including biogenic carbon and other land use carbon emissions in the carbon footprint calculations of a biofuel reveals the importance of the time frame and of the efficiency with which the carbon content of the biomass is utilized. As regards the climate impact of biomass energy use, the key issue is the net impact on carbon stocks (in the organic matter of soils and biomass) compared with the impact of the replaced energy source. Promoting renewable biomass regardless of biogenic GHG emissions can increase GHG emissions in the short term, and possibly also in the long term.
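
The scenario-difference definition given above can be written compactly; the symbols below are introduced here for illustration only and do not appear in the abstract:

```latex
E_{\text{biogenic}}(t) \;=\; C_{\text{non-use}}(t) \;-\; C_{\text{use}}(t)
```

where C_use(t) and C_non-use(t) denote the carbon stock (in the organic matter of soils and biomass) at time t under the utilization and non-utilization scenarios, respectively; the time argument makes explicit that the estimate depends on the chosen time frame, which is why short-term and long-term conclusions can differ.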

Relevance: 30.00%

Abstract:

Over the past decades, companies' interest in controlling the indirect sourcing process has increased. New indirect procurement strategies are needed in order for companies to manage their indirect costs, and new cost management strategies allow companies to improve their core competences. The research methodology used in this thesis is qualitative. The theory is based on scientific publications; the empirical data were provided by the case organization and were collected from the company's own systems and from project steering meetings. The purpose of the study was to select a new electronic system for the company and to give the company options for reducing the case organization's indirect costs. The results showed that the most effective indirect cost management strategy was adopting a new electronic procurement system.

Relevance: 30.00%

Abstract:

Laser additive manufacturing (LAM), also known as 3D printing, is a powder bed fusion (PBF) type of additive manufacturing (AM) technology used to manufacture metal parts layer by layer with the assistance of a laser beam. The development of the technology from building mere prototype parts to functional parts is due to its design flexibility and to the possibility of manufacturing components tailored and optimised for performance and for the strength-to-weight ratio of the final parts. The study of energy and raw material consumption in LAM is essential, as it might facilitate the adoption and use of the technique in the manufacturing industries. The objective of this thesis was to determine the environmental and economic impact of LAM and to conduct a life cycle inventory (LCI) of CNC machining and LAM in terms of energy and raw material consumption during the production phase. The literature overview in this thesis covers sustainability issues in the manufacturing industries, with a focus on environmental and economic aspects; life cycle assessment and its applicability to the manufacturing industry were also studied. The UPLCI-CO2PE! initiative was identified as the most widely applied existing methodology for conducting LCI analysis in a discrete manufacturing process like LAM. Much of the reviewed literature focused on PBF of polymeric materials, and only a few studies considered metallic materials. The studies that included metallic materials only measured the input and output energy or materials of the process and compared different AM systems, without comparing them to any competing process; nor did any include the effect of process variation when building metallic parts with LAM. In this thesis, experimental tests were carried out to produce dissimilar samples with CNC machining and with LAM. The test samples were designed to include part complexity and weight reductions. A PUMA 2500Y lathe was used for the CNC machining, whereas a modified research machine representing the EOSINT M series was used for the LAM. The raw materials used for the test pieces were stainless steel 316L bar (CNC-machined parts) and stainless steel 316L powder (LAM-built parts). An analysis of the power, time and energy consumed in each manufacturing process during the production phase showed that LAM uses more energy than CNC machining; the high energy consumption resulted from the long production time. The energy consumption profile of CNC machining fluctuated between high and low power ranges, whereas LAM energy usage within each mode (standby, heating, process, sawing) remained relatively constant throughout production. CNC machining was limited in terms of manufacturing freedom, as it was not possible to manufacture all the designed samples by machining, and the sample that could be machined required a large amount of material to be removed as waste. The planning phase in LAM was shorter than in CNC machining, as the latter required many preparation steps. Specific energy consumption (SEC) was estimated for LAM based on the practical results and an assumed platform utilisation. The estimates showed that the SEC could fall when more parts are placed in one build than the six parts used in the empirical tests of this thesis.
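
A toy Python calculation (with invented numbers, not the thesis's measurements) showing why the estimated SEC falls as platform utilisation rises: per-build overheads such as heating and standby are amortised over more parts:

```python
def sec_per_kg(n_parts, e_overhead_kwh=20.0, e_per_part_kwh=5.0,
               part_mass_kg=0.25):
    """Specific energy consumption (kWh/kg) for a build of n identical parts.

    e_overhead_kwh: fixed per-build energy (heating, standby, sawing)
    e_per_part_kwh: marginal melting energy per part
    All values are hypothetical placeholders.
    """
    total_energy = e_overhead_kwh + n_parts * e_per_part_kwh
    total_mass = n_parts * part_mass_kg
    return total_energy / total_mass

for n in (1, 6, 12):
    print(f"{n:2d} parts per build: SEC = {sec_per_kg(n):.1f} kWh/kg")
# SEC drops from 100 kWh/kg (1 part) toward the 20 kWh/kg marginal floor.
```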

Relevance: 30.00%

Abstract:

Successful management of rivers requires an understanding of the fluvial processes that govern them. This, in turn, cannot be achieved without a means of quantifying their geomorphology and hydrology and the spatio-temporal interactions between them, that is, their hydromorphology. For a long time it has been laborious and time-consuming to measure river topography, especially in the submerged part of the channel. Measuring the flow field has been challenging as well, and hence such measurements have long been sparse in natural environments. Technological advances in remote sensing in recent years have opened up new possibilities for capturing synoptic information on river environments. This thesis presents new developments in the fluvial remote sensing of both topography and water flow. A set of close-range remote sensing methods is employed to construct a high-resolution, unified, empirical hydromorphological model: the river channel and floodplain topography together with the three-dimensional areal flow field. Empirical as well as hydraulic theory-based optical remote sensing methods are tested and evaluated using normal-colour aerial photographs and sonar calibration and reference measurements on a rocky-bed sub-Arctic river. The empirical optical bathymetry model is developed further by introducing a deep-water radiance parameter estimation algorithm that extends the field of application of the model to shallow streams. The effect of this parameter on the model is also assessed in a study of a sandy-bed sub-Arctic river using close-range high-resolution aerial photography, presenting one of the first examples of fluvial bathymetry modelling from unmanned aerial vehicles (UAVs). Further close-range remote sensing methods are added to complete the topography, integrating the river bed with the floodplain to create a seamless high-resolution topography. Boat-, cart- and backpack-based mobile laser scanning (MLS) is used to measure the topography of the dry part of the channel at high resolution and accuracy. Multitemporal MLS is evaluated, along with UAV-based photogrammetry, against terrestrial laser scanning reference data and merged with UAV-based bathymetry to create a two-year series of seamless digital terrain models. These allow the methodology to be evaluated for high-resolution change analysis of the entire channel. The remote sensing based model of hydromorphology is completed by a new methodology for mapping the flow field in 3D. An acoustic Doppler current profiler (ADCP) is deployed on a remote-controlled boat with a survey-grade global navigation satellite system (GNSS) receiver, allowing the areally sampled 3D flow vectors to be positioned in 3D space as a point cloud; interpolating this point cloud into a 3D matrix enables quantitative volumetric flow analysis. Multitemporal areal 3D flow field data show the evolution of the flow field during a snow-melt flood event. The combination of the underwater and dry topography with the flow field yields a complete model of river hydromorphology at the reach scale.
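
A minimal Python sketch of the final step described above: interpolating a GNSS-positioned point cloud of 3D flow vectors into a regular 3D matrix for volumetric analysis. The data, grid dimensions and scipy-based method are illustrative assumptions, not the thesis's actual workflow:

```python
import numpy as np
from scipy.interpolate import griddata

# Hypothetical ADCP point cloud: each sample has a position (x, y, z)
# from GNSS positioning and a 3D velocity vector (u, v, w).
rng = np.random.default_rng(0)
pts = rng.uniform(0, 10, size=(500, 3))      # sample positions [m]
vel = rng.normal(0, 0.5, size=(500, 3))      # velocity components [m/s]

# Regular 3D grid spanning the surveyed reach.
xi, yi, zi = np.meshgrid(np.linspace(0, 10, 20),
                         np.linspace(0, 10, 20),
                         np.linspace(0, 10, 10), indexing="ij")
grid = np.stack([xi, yi, zi], axis=-1).reshape(-1, 3)

# Interpolate each velocity component into the 3D matrix
# (linear interpolation; cells outside the convex hull become NaN).
u, v, w = (griddata(pts, vel[:, k], grid, method="linear").reshape(xi.shape)
           for k in range(3))
speed = np.sqrt(u**2 + v**2 + w**2)          # volumetric flow magnitude
```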

Relevance: 30.00%

Abstract:

An interesting fact about language cognition is that stimulation involving incongruence in the merge operation between a verb and its complement has often been related to a negative event-related potential (ERP) of augmented amplitude and a latency of ca. 400 ms: the N400. Using an automatic ERP latency and amplitude estimator to facilitate the recognition of waves with a low signal-to-noise ratio, the objective of the present study was to characterize the N400 statistically in 24 volunteers. Stimulation consisted of 80 experimental sentences (40 congruous and 40 incongruous), generated in Brazilian Portuguese, involving two distinct local verb-argument combinations (nominal object and pronominal object series). For each volunteer, the EEG was acquired simultaneously at 20 derivations, topographically localized according to the 10-20 International System. A computerized routine for automatic N400-peak marking (based on the ascending zero-crossing of the first derivative of the waveform) was applied to the estimated individual ERP waveform for congruous and incongruous sentences in both series and at all ERP topographic derivations. Peak-to-peak N400 amplitude was significantly augmented (P < 0.05; one-sided Wilcoxon signed-rank test) by incongruence at derivations F3, T3, C3, Cz, T5, P3, Pz and P4 for the nominal object series and at P3, Pz and P4 for the pronominal object series. The results also indicated high inter-individual variability in the ERP waveforms, suggesting that the usual procedure of grand averaging might not be a generally adequate approach. Hence, statistical signal processing techniques should be applied in neurolinguistic ERP studies, allowing waveform analysis at low signal-to-noise ratios.
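
A simplified Python sketch of the peak-marking rule described above: an ascending zero-crossing of the first derivative marks a local minimum of the waveform, i.e. a candidate negative peak such as the N400. The search window and the deepest-minimum tie-break are illustrative assumptions:

```python
import numpy as np

def mark_negative_peak(erp, fs, t_min=0.3, t_max=0.5):
    """Return (latency [s], amplitude) of the deepest negative peak
    found via ascending zero-crossings of the first derivative
    inside the search window [t_min, t_max]."""
    i0, i1 = int(t_min * fs), int(t_max * fs)
    d = np.diff(erp[i0:i1])                    # first derivative (discrete)
    # ascending zero-crossing: derivative goes from negative to >= 0,
    # i.e. the waveform passes through a local minimum
    idx = np.where((d[:-1] < 0) & (d[1:] >= 0))[0]
    if idx.size == 0:
        return None                            # no negative peak in window
    cand = i0 + 1 + idx                        # candidate sample indices
    peak = cand[np.argmin(erp[cand])]          # keep the deepest minimum
    return peak / fs, erp[peak]
```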