976 results for realistic graphics


Relevance: 10.00%

Publisher:

Abstract:

The global concern about sustainability has been growing, and the mining industry is being questioned about its environmental and social performance. Corporate social responsibility (CSR) is an important issue for the extractive industries. The main objective of this study was to investigate the relationship between the CSR performance and financial performance of selected mining companies. The study was conducted by identifying and comparing a selection of available CSR performance indicators with financial performance indicators. Based on the results of the study, the relationship between CSR performance and financial performance is unclear for the selected group of companies. The results are mixed, and no industry-specific, realistic way to measure CSR performance uniformly is available. The results as a whole are contradictory and vary at the company level as well as with the selected indicators. This study confirms that the relationship between CSR performance and financial performance is complicated and difficult to determine. Consequently, the benefits of CSR in the mining sector may be better analyzed on the basis of different attributes.

Abstract:

In this study, cantilever-enhanced photoacoustic spectroscopy (CEPAS) was applied in different drug detection schemes. The study was divided into two applications: trace detection of vaporized drugs and drug precursors in the gas phase, and detection of cocaine abuse in hair. The main focus, however, was the study of hair samples. In the gas phase, methyl benzoate, a hydrolysis product of cocaine hydrochloride, and benzyl methyl ketone (BMK), a precursor of amphetamine and methamphetamine, were investigated. In the solid phase, hair samples from cocaine overdose patients were measured and compared to a drug-free reference group. As hair consists mostly of long fibrous proteins generally called keratin, proteins from fingernails and saliva were also studied for comparison. Different measurement setups were applied in this study. Gas measurements were carried out using quantum cascade lasers (QCL) as the source in the photoacoustic detection. An external cavity (EC) design was also used for a broader tuning range. Detection limits of 3.4 parts per billion (ppb) for methyl benzoate and 26 ppb for BMK in 0.9 s were achieved with the EC-QCL PAS setup. The achieved detection limits are sufficient for realistic drug detection applications. The measurements from drug overdose patients were carried out using Fourier transform infrared (FTIR) PAS. The drug-containing hair samples and drug-free samples were both measured with the FTIR-PAS setup, and the measured spectra were analyzed statistically with principal component analysis (PCA). With PCA and proper spectral pre-processing, the two groups could be separated by their spectra. To improve the method, EC-QCL measurements of the hair samples, and studies using photoacoustic microsampling techniques, were performed. High-quality, high-resolution spectra with a broad tuning range were recorded from a single hair fiber.
This broad tuning range of an EC-QCL has not previously been used in the photoacoustic spectroscopy of solids. However, no drug detection studies were performed with the EC-QCL solid-phase setup.
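The statistical separation step can be sketched as follows. This is a minimal, hypothetical illustration of PCA via SVD on pre-processed spectra (rows = samples), not the actual analysis pipeline used in the study:

```python
import numpy as np

def pca_scores(spectra, n_components=2):
    # Project mean-centred spectra (rows = samples, columns = wavenumber
    # channels) onto their first principal components via SVD.
    x = spectra - spectra.mean(axis=0)
    u, s, vt = np.linalg.svd(x, full_matrices=False)
    return x @ vt[:n_components].T
```

When two sample groups differ systematically in some spectral region, their scores cluster apart along the first components, which is the kind of separation the study reports between drug-containing and drug-free hair spectra.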

Abstract:

This study combines several projects related to flows in vessels with complex shapes representing different chemical apparatuses. Three major cases were studied. The first is a two-phase plate reactor with a complex structure of intersecting microchannels engraved on one plate, which is covered by another plain plate. The second case is a tubular microreactor, consisting of two subcases. The first subcase is a multi-channel two-component commercial micromixer (slit interdigital) used to mix two liquid reagents before they enter the reactor. The second subcase is a micro-tube, in which the distribution of the heat generated by the reaction was studied. The third case is a conventionally packed column. Here, however, flow, reactions and mass transfer were not modeled. Instead, the research focused on how to describe mathematically the realistic geometry of the column packing, which is rather random and cannot be created using conventional computer-aided design or engineering (CAD/CAE) methods. Several modeling approaches were used to describe the performance of the processes in the considered vessels. Computational fluid dynamics (CFD) was used to describe the details of the flow in the plate microreactor and the micromixer. A space-averaged mass transfer model based on Fick’s law was used to describe the exchange of species through the gas-liquid interface in the microreactor. This model utilized data, namely the values of the interfacial area, obtained with the corresponding CFD model. A common heat transfer model was used to find the heat distribution in the micro-tube. To generate the column packing, an additional multibody dynamic model was implemented. An auxiliary simulation was carried out to determine the position and orientation of every packing element in the column. This data was then exported into a CAD system to generate the desired geometry, which could further be used for CFD simulations.
The results demonstrated that the CFD model of the microreactor predicted the flow pattern well and agreed with experiments. The mass transfer model made it possible to estimate the mass transfer coefficient. Modeling of the second case showed that the flow in the micromixer and the heat transfer in the tube could be excluded from the larger model describing the chemical kinetics in the reactor. The results of the third case demonstrated that the auxiliary simulation could successfully generate complex random packings, not only for the column but also for other similar cases.
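The abstract describes the mass transfer model only at the level of Fick's law. A minimal film-model sketch of the idea (my own simplified illustration with a hypothetical film thickness, not the thesis's actual space-averaged model) is:

```python
def interfacial_transfer_rate(diffusivity, film_thickness, area, c_interface, c_bulk):
    """Film-model estimate based on Fick's law.

    The mass transfer coefficient is k_L = D / delta, and the molar
    transfer rate across the gas-liquid interface is k_L * A * (C* - C).
    In the thesis, the interfacial area A was supplied by the CFD model.
    """
    k_l = diffusivity / film_thickness
    return k_l * area * (c_interface - c_bulk)
```

Fitting such a coefficient against simulated or measured concentration profiles is one common way to extract k_L, which is the quantity the study reports estimating.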

Abstract:

Social media has become a part of many people’s everyday lives. In the library field the adoption of social media has been widespread, and discussions of the development of “Library 2.0” began at an early stage. The aim of this thesis is to study the interface between public libraries, social media, and users, focusing on information activities. The main research question is: How is the interface between public libraries and social media perceived and acted upon by its main stakeholders (library professionals and users)? The background of Library 2.0 is strongly associated with the development of the Web and social media, as well as with the user-centered and information-technological development of public libraries. The theoretical framework builds on research within Library and Information Science concerning information behavior, information practice, and information activities. Earlier research on social media and public libraries is also highlighted in this thesis. Survey and content analysis methods were applied to map the interface between social media and public libraries. One questionnaire was handed out to the users and another was sent to the library professionals; the results were statistically analyzed. In the content analysis, public library Facebook pages were studied. All the empirical investigations were conducted in the region of Finland Proper. An integrated analysis of the results deepens the understanding of the key elements of the social media and public library context. These elements are interactivity, information activities, perceptions, and stakeholders. In this context seven information activities were distinguished: reading, seeking, creating, communicating, informing, mediating, and contributing. This thesis contributes to the development of research concerning information activities and draws a realistic picture of the challenges and opportunities in the social media and public library context.
It also contributes knowledge about library professionals and library users, and the existing differences in their perceptions of the interface between libraries and social media.

Abstract:

Taking a realist view that law is one form of politics, this dissertation studies the roles of citizens and organizations in mobilizing the law to request that government agencies disclose environmental information in China; how, during this process, the socio-legal field interacts with the political-legal sphere; and what changes their interactions have brought about. The work takes a socio-legal approach and applies the methodologies of social science and legal analysis. It aims to understand the paradox of why and how citizens and entities have been invoking the law to access environmental information even though various obstacles exist and the effectiveness of the new mechanism of environmental information disclosure remains low. The study is largely based on 28 cases and eight surveys of environmental information disclosure requests collected by the author. The cases and surveys analysed in this dissertation all occurred between May 2008, when the OGI Regulations and the OEI Measures came into effect, and August 2012, when the case collection was completed. The findings show that by invoking the rules of law made by the authorities to demand that government agencies disclose environmental information, the public, including citizens, organizations, law firms, and the media, have strategically created repercussive pressure on the authorities to act according to the law. While it was a top-down process that established the mechanism of open government information in China, it is the bottom-up activism of the public that makes it work. Citizens’ and organizations’ use of legal tactics to push government agencies to disclose environmental information has become not only an end of accessing the information but, more importantly, a means of holding government agencies accountable to their legal obligations. Law has thus played a pivotal role in enabling citizen participation in the political process.
Against the current situation in China, where political campaigns, or politicization, from general elections to collective actions, especially contentious actions, are still restrained or even repressed by the government, legal mobilization, or judicialization, in which citizens and organizations use legal tactics to demand their rights and push government agencies to enforce the law, has become a de facto alternative form of political participation. During this process, legal actions have helped to strengthen civil society, make government agencies act according to law, push back the political boundaries, and induce changes in the relationship between the state and the public. In the field of environmental information disclosure, citizens and organizations have formed a bottom-up social activism, though limited in scope, using the language of law and creating progressive social, legal and political changes. This study emphasizes that it is partial and incomplete to understand China’s transition only through top-down policy-making and government administration; it is also important to observe it from the bottom-up perspective that, in a realistic view, law can be part of politics, and legal mobilization, even when utterly apolitical, can help to achieve political aims as well. This study of legal mobilization in the field of environmental information disclosure also helps us to better understand the function of law: law is not only a tool for the authorities to regulate and control, but inevitably also a weapon for the public to demand that government agencies fulfil the obligations stipulated by the laws the authorities themselves have issued.

Abstract:

Global illumination algorithms are at the center of realistic image synthesis and account for non-trivial light transport and occlusion within scenes, such as indirect illumination, ambient occlusion, and environment lighting. Their computationally most difficult part is determining light source visibility at each visible scene point. Height fields, on the other hand, constitute an important special case of geometry and are mainly used to describe certain types of objects such as terrains and to map detailed geometry onto object surfaces. The geometry of an entire scene can also be approximated by treating the distance values of its camera projection as a screen-space height field. In order to shadow height fields from environment lights, a horizon map is usually used to occlude incident light. We reduce the per-receiver time complexity of generating the horizon map on N × N height fields from the O(N) of previous work to O(1) by using an algorithm that incrementally traverses the height field and reuses the information already gathered along the path of traversal. We also propose an accurate method to integrate the incident light within the limits given by the horizon map. Indirect illumination in height fields requires information about which other points are visible to each height field point. We present an algorithm to determine this intervisibility in a time complexity that matches the space complexity of the produced visibility information, in contrast to previous methods, which scale with the height field size. As a result, the amount of computation is reduced by two orders of magnitude in common use cases. Screen-space ambient obscurance methods approximate ambient obscurance from the depth buffer geometry and have been widely adopted by contemporary real-time applications.
They work by sampling the screen-space geometry around each receiver point, but they have previously been limited to near-field effects because sampling a large radius quickly exceeds the render time budget. We present an algorithm that reduces the quadratic per-pixel complexity of previous methods to a linear complexity by line-sweeping over the depth buffer and maintaining an internal representation of the processed geometry from which occluders can be efficiently queried. Another algorithm is presented to determine ambient obscurance from the entire depth buffer at each screen pixel. The algorithm scans the depth buffer in a quick pre-pass and locates important features in it, which are then used to evaluate the ambient obscurance integral accurately. We also propose an evaluation of the integral such that results within a few percent of the ray-traced screen-space reference are obtained at real-time render times.
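To make the quantity concrete: along one traversal direction, a horizon map stores, per receiver, the maximum elevation angle of the terrain occluding that direction. A naive O(N)-per-receiver reference computation over a 1D height profile (my own illustration; the thesis's contribution is precisely avoiding this inner loop by incremental traversal) might look like:

```python
import math

def horizon_angles(heights, spacing=1.0):
    # For each sample, the maximum elevation angle of the terrain
    # lying to its left; light below this angle is occluded.
    angles = []
    for i, hi in enumerate(heights):
        best = -math.pi / 2  # fully open horizon
        for j in range(i):
            dx = (i - j) * spacing
            best = max(best, math.atan2(heights[j] - hi, dx))
        angles.append(best)
    return angles
```

The incident environment light at a receiver is then integrated only over directions above the stored angle, which is the integration step the abstract refers to.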

Abstract:

The objective of this Master’s thesis is to create a calculation model for working capital management in value chains. The study was executed using a literature review and constructive research methods, the latter mainly modeling. The theory in this thesis is founded on research articles and management literature. The model is developed for students and researchers, who can use it for working capital management and for comparing firms to each other. The model can also be used for cash management. It shows who benefits and who suffers most in the value chain, and it makes the cash flows of companies and value chains visible. With the model, one can see whether the set targets are actually achieved, and the amount of operational working capital can be observed. The model enables the user to simulate the amount of working capital. The created model is based on the cash conversion cycle, return on investment, and cash flow forecasting. The model was tested with carefully considered figures which, though illustrative, are realistic. The modeled value chain is literally a chain. Using this model requires that the user have some understanding of working capital management and some figures from the balance sheet and income statement. By using this model, users can improve their knowledge of working capital management in value chains.
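The cash conversion cycle underlying the model combines inventory, receivables and payables days in the standard textbook way (the function and figures below are my own illustration, not taken from the thesis):

```python
def cash_conversion_cycle(inventory, receivables, payables, cogs, revenue, days=365):
    # CCC = DIO + DSO - DPO, all expressed in days.
    dio = inventory / cogs * days       # days inventory outstanding
    dso = receivables / revenue * days  # days sales outstanding
    dpo = payables / cogs * days        # days payables outstanding
    return dio + dso - dpo
```

A shorter cycle means less operational working capital is tied up, which is what such a model lets the user simulate for each firm in the chain.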

Abstract:

This study discusses the procedures of value co-creation in the video gaming industry. The purpose of this study was to identify those procedures, answering the main research problem, how value is co-created in the video gaming industry, followed by three sub-questions: (i) What is value co-creation in the gaming industry? (ii) Who participates in value co-creation in the gaming industry? (iii) What procedures are involved in value co-creation in the gaming industry? The theoretical background of the study consists of literature relating to marketing theory: the notion of value, the conventional understanding of value creation, the value chain, the co-creation approach, and the co-production approach. The research adopted a qualitative approach. As a relationship platform, the researcher used a Web 2.0 interface. Data were collected from social networks, and the netnography method was applied to analyze them. The findings show that customer and company co-create an optimal level of value when they interact with each other, as do customers among themselves. It was, however, mostly the C2C interactions, discussions and dialogue threads that emerged around the main discussion that facilitated value co-creation. Companies therefore need to exploit, and further motivate, develop and support, the interactions between customers participating in value creation. A hierarchy of value co-creation processes is derived from the identified challenges of the value co-creation approach and from the analysis of the discussion forum data. Overall, three general sets and seven topics were found that explore the phenomenon of customer-to-customer (C2C) and business-to-customer (B2C) interaction and debate for value co-creation through user-generated content. These topics describe how gamers contribute and interact in co-creating value together with companies.
A methodical search of the current research literature identified numerous evolving streams of value relevant to this study: the general management perspective, new product development and innovation, virtual customer environments, service science, and service-dominant logic. Overall, the topics deliver various practical and conceptual implications for engaging and managing gamers in social networks so as to augment customers’ value co-creation process.

Abstract:

Employees and their knowledge and skills are considered among a company's most important resources. Because of the constantly changing operating environment, it is important to look after employees' health and well-being. Up-to-date information on well-being at work is needed so that the development of the personnel can be monitored and a realistic picture formed of the factors affecting the personnel's long-term performance. The aim of this study is to examine the development needs of the target company's human resources reporting. The study is limited to the target company's internal reporting and well-being-at-work reporting. The material was collected through theme interviews and e-mail interviews. The target company needs new HR indicators and a regular personnel survey carried out sufficiently often. In addition, the results of the personnel surveys and the measures taken to develop well-being at work need to be reported in a form visible to the entire personnel.

Abstract:

The use of tubular joints in the support structures of offshore installations is very common. Manufacturing the joints is difficult and slow. Very often the tubular joints of support structures must be welded manually because of the large size of the support structure. With a new manufacturing method, in which the structure is assembled from smaller parts, the manufacturing and welding of tubular joints can be automated. A robotic welding station, together with its user interface and software, was found to be a workable solution for welding tubular joints. Automation design involves many stages, and going through them carefully ensures a more realistic concept solution. The concept solution evolves as the equipment and layout are refined. During automation design, the aim is to find the right level of automation. The chosen level of automation affects the productivity, lead time and flexibility of production. The degree of automation also affects the amount and nature of human work. In this Master's thesis, welding automation solutions for tubular workpieces were developed for Pemamek Oy. The concept solution for the welding and production automation of a factory manufacturing piping components was examined through an example case, which illustrated how an automation system can be designed at the concept level. The second welding automation solution developed in this work is a robotic welding station, with its user interface, for welding tubular joints.

Abstract:

Tissues such as skeletal and cardiac muscles must sustain very large changes in ATP turnover rate during equally large changes in work; in many skeletal muscles these changes can exceed 100-fold. Examination of a number of cell- and whole-organism-level systems identifies ATP concentration as a key parameter of the interior milieu that is nearly universally 'homeostatic'; it is common to observe no change in ATP concentration even while its turnover rate increases or decreases by two orders of magnitude or more. A large number of other intermediates of cellular metabolism are also regulated within narrow concentration ranges, but none seemingly as precisely as [ATP]. In fact, the only other metabolite in aerobic energy metabolism that is seemingly as 'homeostatic' is oxygen, at least in working muscles, where myoglobin serves to buffer oxygen concentration at stable and constant values at work rates up to the aerobic maximum. In contrast to intracellular oxygen concentration, a 1:1 relationship between oxygen delivery and metabolic rate is observed over biologically realistic and large-magnitude changes in work. The central regulatory question is how the oxygen delivery signal is transmitted to the intracellular metabolic machinery. Traditional explanations assume diffusion as the dominant mechanism, while proponents of an ultrastructurally dominated view of the cell assume an intracellular perfusion system to account for the data which have been most perplexing to metabolic biochemistry so far: the striking lack of correlation between changes in pathway reaction rates and changes in concentrations of pathway substrates, including oxygen and pathway intermediates.

Abstract:

Innovative gas-cooled reactors, such as the pebble bed reactor (PBR) and the gas-cooled fast reactor (GFR), offer higher efficiency and new application areas for nuclear energy. Numerical methods were applied and developed to analyse the specific features of these reactor types with fully three-dimensional calculation models. In the first part of this thesis, the discrete element method (DEM) was used for physically realistic modelling of the packing of fuel pebbles in PBR geometries, and methods were developed for utilising the DEM results in subsequent reactor physics and thermal-hydraulics calculations. In the second part, the flow and heat transfer for a single gas-cooled fuel rod of a GFR were investigated with computational fluid dynamics (CFD) methods. An in-house DEM implementation was validated and used for packing simulations, in which the effect of several parameters on the resulting average packing density was investigated. The restitution coefficient was found to have the most significant effect. The results can be utilised in further work to obtain a pebble bed with a specific packing density. The packing structures of selected pebble beds were also analysed in detail, and local variations in the packing density were observed, which should be taken into account especially in reactor core thermal-hydraulic analyses. Two open-source DEM codes were used to produce stochastic pebble bed configurations to add realism and improve the accuracy of criticality calculations performed with the Monte Carlo reactor physics code Serpent. The Russian ASTRA criticality experiments were calculated. Pebble beds corresponding to the experimental specifications within measurement uncertainties were produced in DEM simulations and successfully exported into the subsequent reactor physics analysis. With the developed approach, two typical issues in Monte Carlo reactor physics calculations of pebble bed geometries were avoided.
A novel method was developed and implemented as a MATLAB code to calculate the porosities in the cells of a CFD calculation mesh constructed over a pebble bed obtained from DEM simulations. The code was further developed to distribute power and temperature data accurately between discrete-based reactor physics and continuum-based thermal-hydraulics models to enable coupled reactor core calculations. The developed method was also found useful for analysing sphere packings in general. CFD calculations were performed to investigate the pressure losses and heat transfer in three-dimensional air-cooled smooth and rib-roughened rod geometries, housed inside a hexagonal flow channel representing a sub-channel of a single fuel rod of a GFR. The CFD geometry represented the test section of the L-STAR experimental facility at Karlsruhe Institute of Technology, and the calculation results were compared to the corresponding experimental results. Knowledge was gained of the adequacy of various turbulence models and of the modelling requirements and issues related to this specific application. The obtained pressure loss results were in relatively good agreement with the experimental data. Heat transfer in the smooth rod geometry was somewhat underpredicted, which can partly be explained by unaccounted heat losses and uncertainties. In the rib-roughened geometry, heat transfer was severely underpredicted by the realizable k-epsilon turbulence model used. An additional calculation with a v2-f turbulence model showed significant improvement in the heat transfer results, most likely owing to the better performance of that model in separated flow problems. Further investigations are suggested before CFD is used to draw conclusions about the heat transfer performance of rib-roughened GFR fuel rod geometries.
It is suggested that the viewpoints of numerical modelling be included in the planning of experiments, to ease the challenging model construction and simulations and to avoid introducing additional sources of uncertainty. To facilitate the use of advanced calculation approaches, multi-physical aspects of experiments should also be considered and documented in reasonable detail.
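The porosity-mapping step can be illustrated with a simple Monte Carlo estimate: the void fraction of a mesh cell is one minus the fraction of random sample points that fall inside any pebble. This is a hedged sketch of the idea only (the thesis used a MATLAB implementation; the routine and parameters below are my own illustration):

```python
import numpy as np

def cell_porosity(cell_min, cell_max, centers, radius, n_samples=20000, seed=0):
    # Monte Carlo estimate of the void fraction of an axis-aligned
    # hexahedral cell overlapped by equal-radius spheres (pebbles).
    rng = np.random.default_rng(seed)
    pts = rng.uniform(cell_min, cell_max, size=(n_samples, 3))
    inside = np.zeros(n_samples, dtype=bool)
    for c in centers:
        inside |= np.sum((pts - c) ** 2, axis=1) <= radius ** 2
    return 1.0 - inside.mean()
```

Repeating this per cell yields the porosity field that a continuum thermal-hydraulics model needs over a discrete DEM-generated packing.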

Abstract:

Being able to use the exact coordinates of pebbles and fuel particles in Monte Carlo reactor physics calculations is an important development step for pebble bed reactor modelling. It allows exact modelling of pebble bed reactors with realistic pebble beds, without placing the pebbles in regular lattices. In this study, the multiplication coefficient of the HTR-10 pebble bed reactor is calculated with the Serpent reactor physics code and, using this multiplication coefficient, the number of pebbles required for the critical loading of the reactor. The multiplication coefficient is calculated using pebble beds produced with the discrete element method and three different material libraries, in order to compare the results. The obtained results are lower than those measured at the experimental reactor and somewhat lower than those obtained with other codes in earlier studies.
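Estimating the critical loading from computed multiplication coefficients can be done, for example, by interpolating k_eff between two loadings to the point where k_eff = 1. This is a generic sketch with hypothetical figures, not the study's actual numbers or method:

```python
def critical_load(n1, k1, n2, k2):
    # Linear interpolation of the pebble count at which k_eff = 1,
    # given k_eff computed (e.g. with a Monte Carlo code such as
    # Serpent) at two loadings n1 < n2.
    return n1 + (1.0 - k1) * (n2 - n1) / (k2 - k1)
```

With hypothetical values k_eff = 0.98 at 15,000 pebbles and 1.01 at 18,000 pebbles, the interpolated critical loading would be 17,000 pebbles.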

Abstract:

Wind energy has raised great expectations owing to the risks of global warming and of accidents at nuclear power plants. Nowadays, wind farms are often constructed in areas of complex terrain. A potential wind farm location must have its site thoroughly surveyed and its wind climatology analyzed before any hardware is installed. Therefore, modeling Atmospheric Boundary Layer (ABL) flows over complex terrains containing, e.g., hills, forest, and lakes is of great interest in wind energy applications, as it can help in locating and optimizing wind farms. Numerical modeling of wind flows using Computational Fluid Dynamics (CFD) has become a popular technique during the last few decades. Due to the inherent flow variability and large-scale unsteadiness typical of ABL flows in general, and especially over complex terrains, the flow can be difficult to predict accurately enough using the Reynolds-Averaged Navier-Stokes (RANS) equations. Large-Eddy Simulation (LES) resolves the largest, and thus most important, turbulent eddies and models only the small-scale motions, which are more universal than the large eddies and thus easier to model. LES is therefore expected to be more suitable for this kind of simulation, although it is computationally more expensive than the RANS approach. With the fast development of computers and open-source CFD software in recent years, the application of LES to atmospheric flows is becoming increasingly common. The aim of the work is to simulate atmospheric flows over realistic and complex terrains by means of LES. Evaluation of potential inland wind park locations will be the main application of these simulations. The development of the LES methodology for simulating atmospheric flows over realistic terrains is reported in the thesis. The work also aims at validating the LES methodology at a real scale.
In the thesis, LES is carried out for flow problems ranging from basic channel flows to real atmospheric flows over one of the most recent real-life complex terrain problems, the Bolund hill. All the simulations reported in the thesis are carried out using a new OpenFOAM®-based LES solver. The solver uses the 4th-order time-accurate Runge-Kutta scheme and a fractional step method. Moreover, the development of the LES methodology pays special attention to two boundary conditions: the upstream (inflow) and wall boundary conditions. The upstream boundary condition is generated using the so-called recycling technique, in which the instantaneous flow properties are sampled on a plane downstream of the inlet and mapped back to the inlet at each time step. This technique develops the upstream boundary-layer flow together with the inflow turbulence without any precursor simulation, and thus within a single computational domain. The roughness of the terrain surface is modeled by implementing a new wall function into OpenFOAM® during the thesis work. Both the recycling method and the newly implemented wall function are validated for channel flows at relatively high Reynolds number before being applied to the atmospheric flow applications. After validating the LES model on simple flows, simulations are carried out for atmospheric boundary-layer flows over two types of hills: first, two-dimensional wind-tunnel hill profiles, and second, the Bolund hill located in Roskilde Fjord, Denmark. For the two-dimensional wind-tunnel hills, the study focuses on the overall flow behavior as a function of the hill slope. Moreover, the simulations are repeated using another wall function suitable for smooth surfaces, which already existed in OpenFOAM®, in order to study the sensitivity of the flow to the surface roughness in ABL flows. The simulated results obtained with the two wall functions are compared against the wind-tunnel measurements.
It is shown that LES using the implemented wall function produces overall satisfactory results for the turbulent flow over the two-dimensional hills. The prediction of the flow separation and reattachment length for the steeper hill is closer to the measurements than the other numerical studies reported in the past for the same hill geometry. The field measurement campaign performed over the Bolund hill provides the most recent field-experiment dataset for the mean flow and the turbulence properties. A number of research groups have simulated the wind flows over the Bolund hill. Due to the challenging features of the hill, such as its almost vertical slope, it is considered an ideal experimental test case for validating micro-scale CFD models for wind energy applications. In this work, the simulated results obtained for two wind directions are compared against the field measurements. It is shown that the present LES can reproduce the complex turbulent wind flow structures over a complicated terrain such as the Bolund hill. In particular, the present LES results show the best prediction of the turbulent kinetic energy, with an average error of 24.1%, which is 43% smaller than any other model results reported in the past for the Bolund case. Finally, the validated LES methodology is demonstrated by simulating the wind flow over the existing Muukko wind farm located in south-eastern Finland. The simulation is carried out for only one wind direction, and the results for the instantaneous and time-averaged wind speeds are briefly reported. The demonstration case is followed by a discussion of the practical aspects of LES for wind resource assessment over a realistic inland wind farm.
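The rough-wall treatment can be made concrete with the logarithmic law of the wall that such a wall function enforces: the mean wind speed at height z over a surface with roughness length z0 follows u(z) = (u*/κ) ln(z/z0). The sketch below uses this textbook relation, not the thesis's actual OpenFOAM® implementation, and the parameter values are hypothetical:

```python
import math

def log_law_velocity(z, u_star, z0, kappa=0.41):
    # Mean velocity from the rough-wall logarithmic law:
    # u(z) = (u*/kappa) * ln(z/z0).
    return (u_star / kappa) * math.log(z / z0)

def friction_velocity(u, z, z0, kappa=0.41):
    # Inverse relation: the friction velocity a wall function would
    # infer from the resolved velocity u at the first cell height z.
    return kappa * u / math.log(z / z0)
```

A wall function of this kind uses the inferred friction velocity to set the wall shear stress, which is how surface roughness enters the LES without resolving the roughness elements themselves.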

Abstract:

The brain is a complex system which produces emergent properties such as those associated with activity-dependent plasticity in the processes of learning and memory. Therefore, understanding the integrated structures and functions of the brain is well beyond the scope of either superficial or extremely reductionistic approaches. Although a combination of zoom-in and zoom-out strategies is desirable when the brain is studied, constructing the appropriate interfaces to connect all levels of analysis is one of the most difficult challenges of contemporary neuroscience. Is it possible to build appropriate models of brain function and dysfunction with computational tools? Among the best-known brain dysfunctions, the epilepsies are neurological syndromes involving a variety of networks, from widespread anatomical brain circuits to local molecular environments. One logical question would be: are those complex brain networks always producing maladaptive emergent properties compatible with epileptogenic substrates? The present review will deal with this question and will try to answer it by illustrating several points from the literature and from our laboratory data, with examples at the behavioral, electrophysiological, cellular and molecular levels. We conclude that, because the brain is a complex system compatible with the production of emergent properties, including plasticity, its functions should be approached using an integrated view. Concepts such as brain networks, graph theory, neuroinformatics, and e-neuroscience are discussed as new transdisciplinary approaches for dealing with the continuous growth of information about brain physiology and its dysfunctions. The epilepsies are discussed as neurobiological models of complex systems displaying maladaptive plasticity.