999 results for Moist environment
Abstract:
This thesis examines the creation, maintenance and management of added value in a networked product development environment. Using the thematic interview method, the aim is to identify and describe the processes, practices and ways of working in which the case company has succeeded and in which added value has been created. Another central aim is to find the problematic areas in the creation of added value and to analyse why these areas are problematic. Based on the concepts of value, value chain and value network, together with examples from the reference literature, a theoretical framework is constructed and the useful practices and ways of working whose adoption creates added value are described. Particularly in the information technology sector, networking and the value network have become increasingly significant modes of product development, driven by the growth of horizontal cooperation, globalisation and the rapid development of information technology. The key findings include the need for a more unified, process-oriented way of working and for reshaping business processes to meet the requirements of a networked R&D environment. The results also emphasised the need to create better visibility and to manage activities according to the requirements of the new type of value network.
Abstract:
From 6 to 8 November 1982 one of the most catastrophic flash-flood events on record struck the Eastern Pyrenees, affecting Andorra as well as France and Spain, with rainfall accumulations exceeding 400 mm in 24 h, 44 fatalities and widespread damage. This paper aims to exhaustively document this heavy precipitation event and examines mesoscale simulations performed with the French Meso-NH non-hydrostatic atmospheric model. Large-scale simulations show the slowly evolving synoptic environment favourable for the development of a deep Atlantic cyclone, which induced a strong southerly flow over the Eastern Pyrenees. From the evolution of the synoptic pattern, four distinct phases have been identified during the event. The mesoscale analysis presents the second and third phases as the most intense in terms of rainfall accumulations and highlights the interaction of the moist and conditionally unstable flows with the mountains. The presence of a SW low-level jet (30 m s-1) around 1500 m also played a crucial role in focusing the precipitation over the exposed south slopes of the Eastern Pyrenees. Backward trajectories based on Eulerian on-line passive tracers indicate that orographic uplift was the main forcing mechanism, which triggered and maintained the precipitating systems for more than 30 h over the Pyrenees. The moisture of the feeding flow mainly came from the Atlantic Ocean (7-9 g kg-1), and the role of the Mediterranean as a local moisture source was very limited (2-3 g kg-1) due to the high initial water vapour content of the parcels and their rapid passage over the basin along the Spanish Mediterranean coast (less than 12 h).
Abstract:
Plants must constantly adapt to a changing light environment in order to optimize energy conversion through the process of photosynthesis and to limit photodamage. In addition, plants use light cues for the timing of key developmental transitions such as the initiation of reproduction (the transition to flowering). Plants are equipped with a battery of photoreceptors enabling them to sense a very broad light spectrum spanning from UV-B to far-red wavelengths (280-750 nm). In this review we briefly describe the different families of plant photosensory receptors and the mechanisms by which they transduce environmental information to influence numerous aspects of plant growth and development throughout the life cycle.
Abstract:
Participants in an immersive virtual environment interact with the scene from an egocentric point of view, that is, from where their bodies appear to be located, rather than from outside as if looking through a window. People interact through normal body movements, such as head-turning, reaching, and bending, and, within the tracking limitations, move through the environment or effect changes within it in natural ways.
Abstract:
Does realistic lighting in an immersive virtual reality application enhance presence, where participants feel that they are in the scene and behave correspondingly? Our previous study indicated that presence is more likely with real-time ray tracing compared with ray casting, but we could not separate the effects of overall quality of illumination from the dynamic effects of real-time shadows and reflections. Here we describe an experiment where 20 people experienced a scene rendered with global or local illumination. However, in both conditions there were dynamically changing shadows and reflections. We found that the quality of illumination did not impact presence, so that the earlier result must have been due to dynamic shadows and reflections. However, global illumination resulted in greater plausibility - participants were more likely to respond as if the virtual events were real. We conclude that global illumination does impact the responses of participants and is worth the effort.
Abstract:
The goal of the present review is to explain how immersive virtual environment technology (IVET) can be used for the study of social interactions and how the use of virtual humans in immersive virtual environments can advance research and application in many different fields. Researchers studying individual differences in social interactions are typically interested in keeping the behavior and the appearance of the interaction partner constant across participants. With IVET researchers have full control over the interaction partners and can standardize them while still keeping the simulation realistic. Virtual simulations are valid: growing evidence shows that indeed studies conducted with IVET can replicate some well-known findings of social psychology. Moreover, IVET allows researchers to subtly manipulate characteristics of the environment (e.g., visual cues to prime participants) or of the social partner (e.g., his/her race) to investigate their influences on participants' behavior and cognition. Furthermore, manipulations that would be difficult or impossible in real life (e.g., changing participants' height) can be easily obtained with IVET. Beside the advantages for theoretical research, we explore the most recent training and clinical applications of IVET, its integration with other technologies (e.g., social sensing) and future challenges for researchers (e.g., making the communication between virtual humans and participants smoother).
Abstract:
The main target of the study was to find ideas for the maintenance and development of supplier relations in an irregular business environment. The other aim was to find out the suppliers’ opinions concerning the case company and the relationship between the companies. The study was conducted using both qualitative and quantitative research methods: a mail survey to find out supplier opinions and an interview to elicit suppliers’ ideas for relationship maintenance and development. It was found that the use of relational elements is essential to relationship maintenance in an irregular environment. In developing supplier relations, the company should make better use of its suppliers’ potential, ensure a better flow of information and utilize the possibilities of Supplier Relationship Management.
Abstract:
This Master’s Thesis examines knowledge creation and transfer processes in an iterative project environment. The aim is to understand how knowledge is created and transferred during an actual iterative implementation project which takes place in International Business Machines (IBM). The second aim is to create and develop new working methods that support more effective knowledge creation and transfer for future iterative implementation projects. The research methodology in this thesis is qualitative. Using focus group interviews as a research method provides qualitative information and introduces the experiences of the individuals participating in the project. This study found that the following factors affect knowledge creation and transfer in an iterative, multinational, and multi-organizational implementation project: shared vision and common goal, trust, open communication, social capital, and network density. All of these received both theoretical and empirical support. As for future projects, strengthening these factors was found to be the key for more effective knowledge creation and transfer.
Abstract:
Nowadays, species distribution models (SDMs) are a widely used tool. Using different statistical approaches, these models reconstruct the realized niche of a species from presence data and a set of variables, often topoclimatic. Their range of uses is quite large, from understanding the requirements of a single species to the creation of nature reserves based on species hotspots or the modelling of climate change impacts. Most of the time these models use variables at a resolution of 50 km x 50 km or 1 km x 1 km. In some cases, however, they are used with resolutions below the kilometre scale and are thus called high resolution models (100 m x 100 m or 25 m x 25 m). Quite recently a new kind of data has emerged enabling precision up to 1 m x 1 m and thus allowing very high resolution modelling. However, these new variables are very costly and require a considerable amount of time to be processed. This is especially the case when they are used in complex calculations such as model projections over large areas. Moreover, the importance of very high resolution data in SDMs has not yet been assessed and is not well understood. Some basic knowledge of what drives species presences and absences is still missing. Indeed, it is not clear whether, in mountain areas like the Alps, coarse topoclimatic gradients drive species distributions, whether fine-scale temperature or topography are more important, or whether their importance can be neglected when balanced against competition or stochasticity. In this thesis I investigated the importance of very high resolution data (2-5 m) in species distribution models, using very high resolution topographic, climatic and edaphic variables over a 2000 m elevation gradient in the Western Swiss Alps. I also investigated the more local responses to these variables for a subset of species living in this area at two precise elevation belts.
During this thesis I showed that high resolution data necessitate very good datasets (species and variables for the models) to produce satisfactory results. Indeed, in mountain areas temperature is the most important factor driving species distributions, and it needs to be modelled at very fine resolution, instead of being interpolated over large surfaces, to produce satisfactory results. Despite the intuitive idea that topography should be very important at high resolution, the results are mixed. Looking at the importance of variables over a large gradient, however, dampens their apparent importance: topographic factors were shown to be highly important at the subalpine level, but their importance decreases at lower elevations. Whereas at the montane level edaphic and land-use factors are more important, high resolution topographic data matter more at the subalpine level. Finally, the biggest improvement in the models comes when edaphic variables are added. Adding soil variables is of high importance, and variables like pH surpass the usual topographic variables in terms of importance in the models. To conclude, high resolution is very important in modelling but necessitates very good datasets. Merely increasing the resolution of the usual topoclimatic predictors is not sufficient, and the use of edaphic predictors was highlighted as fundamental to produce significantly better models. This is of primary importance, especially if these models are used to reconstruct communities or as a basis for biodiversity assessments. -- In recent years, the use of species distribution models (SDMs) has steadily increased. These models use different statistical tools to reconstruct the realized niche of a species from variables, notably climatic or topographic ones, and presence data collected in the field. 
Their use covers many domains, ranging from the study of a species' ecology to the reconstruction of communities or the impact of climate warming. Most of the time, these models use occurrences from global databases at a rather coarse resolution (1 km or even 50 km). Some databases nevertheless make it possible to work at high resolution, and hence to go below the kilometre scale and work with resolutions of 100 m x 100 m or 25 m x 25 m. Recently, a new generation of very high resolution data has appeared, allowing work at the metre scale. The variables that can be generated from these new data are, however, very costly and require considerable processing time. Indeed, any complex statistical computation, such as projections of species distributions over large areas, demands powerful computers and much time. Moreover, the factors governing species distributions at fine scale are still poorly known, and the importance in the models of high resolution variables such as microtopography or temperature is not certain. Other factors, such as competition or natural stochasticity, could have an equally strong influence. It is in this context that my thesis work is situated. I sought to understand the importance of high resolution in species distribution models, whether for temperature, microtopography or edaphic variables, along a large elevation gradient in the Vaud Prealps. I also sought to understand the local impact of certain variables potentially neglected because of confounding effects along the elevation gradient. 
During this thesis, I was able to show that high resolution variables, whether related to temperature or to microtopography, bring only a limited improvement to the models. To obtain a substantial improvement, it is necessary to work with larger datasets, for both the species and the variables used. For example, the usually interpolated climatic layers must be replaced by temperature layers modelled at high resolution from field data. Working along a 2000 m temperature gradient naturally makes temperature very important in the models. The importance of microtopography is negligible compared with topography at a 25 m resolution. At a more local scale, however, high resolution topography is extremely important in the subalpine belt, whereas at the montane belt variables related to soils and land use are very important. Finally, species distribution models were particularly improved by the addition of edaphic variables, mainly pH, whose importance surpasses or equals that of the topographic variables when they are added to the usual species distribution models.
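As a toy illustration of the central finding above, the sketch below fits a plain logistic regression (not the thesis's actual SDM techniques) to invented plot data in which species presence depends mainly on soil pH. All numbers, thresholds and variable names are assumptions for the demonstration: a model using only topoclimatic predictors cannot recover a pattern that an added edaphic predictor captures easily.

```python
import math
import random

random.seed(42)

# Invented survey plots along an elevation gradient: presence here depends
# mainly on soil pH, which the topoclimatic predictors cannot capture.
n = 400
temp = [random.uniform(-2.0, 12.0) for _ in range(n)]   # mean temperature (degC)
slope = [random.uniform(0.0, 45.0) for _ in range(n)]   # slope (degrees)
ph = [random.uniform(3.5, 8.5) for _ in range(n)]       # soil pH
presence = [1 if (p <= 6.0 and t >= 2.0) else 0 for p, t in zip(ph, temp)]

def fit_logistic(X, y, steps=1500, lr=0.2):
    """Plain batch-gradient-descent logistic regression (no external libraries)."""
    d = len(X[0])
    w, b = [0.0] * d, 0.0
    for _ in range(steps):
        gw, gb = [0.0] * d, 0.0
        for xi, yi in zip(X, y):
            z = sum(wj * xj for wj, xj in zip(w, xi)) + b
            p = 1.0 / (1.0 + math.exp(-max(-30.0, min(30.0, z))))
            err = p - yi
            for j in range(d):
                gw[j] += err * xi[j]
            gb += err
        w = [wj - lr * gj / len(y) for wj, gj in zip(w, gw)]
        b -= lr * gb / len(y)
    return w, b

def accuracy(X, y, w, b):
    hits = sum((1 if sum(wj * xj for wj, xj in zip(w, xi)) + b > 0 else 0) == yi
               for xi, yi in zip(X, y))
    return hits / len(y)

# Model 1: topoclimatic predictors only; Model 2: the same plus soil pH.
X_topo = [[t / 12.0, s / 45.0] for t, s in zip(temp, slope)]
X_full = [[t / 12.0, s / 45.0, p / 8.5] for t, s, p in zip(temp, slope, ph)]
w1, b1 = fit_logistic(X_topo, presence)
w2, b2 = fit_logistic(X_full, presence)
acc_topo = accuracy(X_topo, presence, w1, b1)
acc_full = accuracy(X_full, presence, w2, b2)
print(f"topoclimatic only: {acc_topo:.2f}  with soil pH: {acc_full:.2f}")
```

On this synthetic data the pH-augmented model is clearly more accurate, mirroring the conclusion that merely refining topoclimatic predictors is not sufficient.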
Abstract:
Due to the existence of free software and pedagogical guides, the use of Data Envelopment Analysis (DEA) has become further democratized in recent years. Nowadays it is quite usual for practitioners and decision makers with little or no knowledge of operational research to run their own efficiency analyses. Within DEA, several alternative models allow for an environmental adjustment. Four such models, each user-friendly and easily accessible to practitioners and decision makers, are applied to empirical data from 90 primary schools in the State of Geneva, Switzerland. Results show that the majority of the alternative models deliver divergent results. From a political and a managerial standpoint, these diverging results could lead to potentially ineffective decisions. As no consensus emerges on the best model to use, practitioners and decision makers may be tempted to select the model that is right for them, in other words the model that best reflects their own preferences. Further studies should investigate how an appropriate multi-criteria decision analysis method could help decision makers select the right model.
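The divergence problem can be sketched in a few lines. Real DEA solves a linear program per school; the toy below instead uses the single-input, single-output special case, where CCR efficiency reduces to each unit's output/input ratio divided by the best observed ratio. The school data, the 0.5 adjustment weight, and the adjustment itself are invented for illustration, not taken from the study: two reasonable environmental adjustments rank the same schools differently.

```python
# Toy DEA illustration (invented data). Each "model" is the single-input/
# single-output special case of CCR: efficiency = ratio / best ratio.
schools = {
    # name: (teaching hours per pupil, mean test score, share of disadvantaged pupils)
    "A": (1.00, 60.0, 0.50),
    "B": (1.10, 70.0, 0.10),
    "C": (0.90, 55.0, 0.40),
    "D": (1.20, 78.0, 0.05),
}

def ccr_ratio_efficiencies(ratios):
    """Normalize output/input ratios so the best unit scores exactly 1.0."""
    best = max(ratios.values())
    return {name: r / best for name, r in ratios.items()}

# Model 1: ignore the environment entirely.
model1 = ccr_ratio_efficiencies(
    {k: score / hours for k, (hours, score, _) in schools.items()})

# Model 2: credit schools for a harder environment by inflating the output
# (a crude one-stage adjustment; the weight 0.5 is arbitrary).
model2 = ccr_ratio_efficiencies(
    {k: score * (1 + 0.5 * disadv) / hours
     for k, (hours, score, disadv) in schools.items()})

rank1 = sorted(model1, key=model1.get, reverse=True)
rank2 = sorted(model2, key=model2.get, reverse=True)
print("unadjusted ranking: ", rank1)   # D comes out best
print("env-adjusted ranking:", rank2)  # A comes out best
```

The same four schools produce two different "best" units, which is exactly the kind of divergence that could tempt a decision maker to pick whichever model flatters their preferences.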
Abstract:
The thesis studies role based access control and its suitability in the enterprise environment. The aim is to research how extensively role based access control can be implemented in the case organization and how it supports the organization’s business and IT functions. This study points out the enterprise’s needs for access control, the factors of access control in the enterprise environment, the requirements for implementation, and the benefits and challenges it brings along. To determine how extensively role based access control can be implemented in the case organization, the actual state of access control is first examined. Secondly, a rudimentary desired state (how things should be) is defined, and thirdly it is completed using the results of the implementation of a role based access control application. The study results in a role model for the case organization unit, together with the building blocks and the framework for an organization-wide implementation. The ultimate value for the organization is delivered by facilitating its normal operations whilst protecting its information assets.
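The core mechanism the thesis builds on can be shown in a minimal sketch: users are assigned roles, roles bundle permissions, and an access check consults only roles, never individual users. The role, permission and user names below are invented for illustration; a production role model would be derived from the organization's actual functions.

```python
# Minimal role-based access control sketch (invented names).
ROLE_PERMISSIONS = {
    "hr_clerk":   {"employee_record:read"},
    "hr_manager": {"employee_record:read", "employee_record:write"},
    "it_admin":   {"account:create", "account:disable"},
}

USER_ROLES = {
    "alice": {"hr_manager"},
    "bob":   {"hr_clerk", "it_admin"},
}

def has_permission(user: str, permission: str) -> bool:
    """A user holds a permission iff at least one of their roles grants it."""
    return any(permission in ROLE_PERMISSIONS.get(role, set())
               for role in USER_ROLES.get(user, set()))

print(has_permission("alice", "employee_record:write"))  # hr_manager grants it
print(has_permission("bob", "employee_record:write"))    # no role of bob's grants it
```

Because permissions attach to roles rather than to users, onboarding, offboarding and audits reduce to managing the two mappings, which is the facilitation-plus-protection trade-off the study describes.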
Abstract:
The purpose of this study was to increase the understanding of the role and nature of trust in asymmetric technology partnership formation. In the knowledge-based "learning race", knowledge is considered a primary source of competitive advantage. In the emerging ICT sector, the high pace of technological change, the convergence of technologies and industries, and the increasing complexity and uncertainty have forced even the largest players to seek cooperation for complementary knowledge and capabilities. Small technology firms need the complementary resources and legitimacy of large firms to grow and compete in the global marketplace. Most of the earlier research indicates, however, that partnerships of asymmetric size, managerial resources and cultures have failed. A basic assumption supported by earlier research was that trust is a critical factor in asymmetric technology partnership formation. Asymmetric technology partnership formation is a dynamic and multi-dimensional process, and consequently a holistic research approach was selected. The research issue was approached from different levels: the individual decision-maker, the firm, and the relationship between the parties. The impact of the dynamic environment and of the technology content was also analyzed. A multitheoretical approach and a qualitative research method, with in-depth interviews in five large and eight small ICT companies, enabled a holistic and rich view of the research issue. The study contributes to the scarce understanding of the nature and evolution of trust in asymmetric technology partnership formation. It also sheds light on the specific nature of asymmetric technology partnerships. The partnerships were found to be tentative, and the diverse strategic intent of small and large technology firms appeared as a major challenge. The role of the boundary spanner was highlighted as a possibility to match the incompatible organizational cultures. 
A shared vision was found to be a pre-condition for individual-based fast trust leading to intuitive decision-making and experimentation. The relationships were tentative and they were continuously re-evaluated through the key actors' sense-making of the technology content, the asymmetry and the dynamic environment. A multi-dimensional conceptualization of trust was created and propositions on the role and nature of trust for further research are given.
Abstract:
Internationalization and the ensuing rapid growth have created the need to consolidate the IT systems of many small-to-medium-sized production companies. Enterprise Resource Planning (ERP) systems are a common solution for such companies. Deployment of these ERP systems consists of many steps, one of which is the implementation of the same shared system at all international subsidiaries. From the IT point of view, this is also one of the most important steps in the company's internationalization strategy. The mechanical process of creating the required connections for the off-shore sites is the easiest and best-documented step along the way, but the actual value of the system, once operational, is perceived in its operational reliability. The operational reliability of an ERP system is a combination of many factors, ranging from hardware- and connectivity-related issues to administrative tasks and communication between decentralized administrative units and sites. To accurately analyze the operational reliability of such a system, one must take into consideration its full functionality, including not only the mechanical and systematic processes but also the users and their administration. Operational reliability in an international environment relies heavily on hardware and telecommunication adequacy, so it is imperative to have resources dimensioned with regard to planned usage. Still, with poorly maintained communication and administration schemes, no amount of bandwidth or memory will be enough to maintain a productive level of reliability. This thesis analyzes the implementation of a shared ERP system at an international subsidiary of a Finnish production company. The system is Microsoft Dynamics AX, currently being introduced at a Slovakian facility, a subsidiary of Peikko Finland Oy. The primary task is to create a feasible basis of analysis against which the operational reliability of the system can be evaluated precisely. 
With a solid analysis, the aim is to give recommendations on how future implementations should be managed.