940 results for Temporal constraints analysis


Relevance: 30.00%

Abstract:

The medial temporal lobe (MTL), comprising the hippocampus and the surrounding neocortical regions, is a brain area targeted by several neurological diseases. Although functional magnetic resonance imaging (fMRI) has been widely used to assess brain functional abnormalities, detecting MTL activation has been technically challenging. The aim of our study was to provide an fMRI paradigm that reliably activates MTL regions at the individual level, thus providing a useful tool for future research in clinical memory-related studies. Twenty young healthy adults underwent an event-related fMRI study consisting of three encoding conditions: word pairs, face-name associations and complex visual scenes. A region-of-interest analysis at the individual level, comparing novel and repeated stimuli independently for each task, was performed. This analysis yielded activations in the hippocampal and parahippocampal regions in most of the participants. Specifically, 95% of participants showed significant activation in the left hippocampus during face-name encoding, and 100% showed significant activation in the right parahippocampus during scene encoding. Additionally, a whole-brain analysis, also comparing novel versus repeated stimuli at the group level, showed mainly left frontal activation during the word task. In this group analysis, the face-name association task engaged the hippocampus and fusiform gyri bilaterally, along with the left inferior frontal gyrus, and the complex visual scenes activated mainly the parahippocampus and hippocampus bilaterally. In sum, our task design represents a rapid and reliable way to explore MTL activity at the individual level, providing a useful tool for future clinical memory-related fMRI studies.
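
As an illustration of the kind of individual-level contrast described above, the sketch below compares trial-wise ROI responses for novel versus repeated stimuli with a two-sample t-test. It is a minimal sketch assuming per-trial response amplitudes have already been extracted from a hippocampal ROI; the arrays and threshold are hypothetical, not the study's actual pipeline.

```python
import numpy as np
from scipy import stats

def roi_novelty_contrast(novel_responses, repeated_responses, alpha=0.05):
    """Two-sample t-test comparing trial-wise ROI amplitudes for
    novel versus repeated stimuli in a single participant."""
    t_val, p_val = stats.ttest_ind(novel_responses, repeated_responses)
    return {"t": t_val, "p": p_val, "significant": p_val < alpha and t_val > 0}

# Hypothetical per-trial response amplitudes (e.g., percent signal change)
rng = np.random.default_rng(0)
novel = rng.normal(0.8, 0.3, size=40)      # novel face-name pairs
repeated = rng.normal(0.3, 0.3, size=40)   # repeated face-name pairs
print(roi_novelty_contrast(novel, repeated))
```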

Relevance: 30.00%

Abstract:

This work studies the spatial and seasonal variability of selected physicochemical parameters in the Turvo/Grande watershed, São Paulo State, Brazil. Water samples were taken monthly (2007/07-2008/11) at fourteen sampling stations sited along the Turvo, Preto and Grande Rivers and their main tributaries. Principal component analysis and hierarchical cluster analysis revealed two distinct groups in this watershed: the first comprised the sites most impacted by domestic effluent (the lowest dissolved oxygen levels in the studied region), while the downstream sampling sites (Turvo and Grande Rivers) formed a second group, discriminated by diffuse pollution from flooding and agricultural runoff.
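
A minimal sketch of the multivariate workflow described above (standardization, PCA, and hierarchical clustering of sampling stations) is given below. The data matrix and the number of clusters are hypothetical placeholders, not the study's measurements.

```python
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from scipy.cluster.hierarchy import linkage, fcluster

# Hypothetical matrix: 14 sampling stations x 5 water-quality parameters
rng = np.random.default_rng(1)
X = rng.normal(size=(14, 5))

# Standardize so each parameter contributes equally
X_std = StandardScaler().fit_transform(X)

# Principal components summarizing the main gradients of variation
pca = PCA(n_components=2)
scores = pca.fit_transform(X_std)
print("explained variance ratio:", pca.explained_variance_ratio_.round(2))
print("PC1 scores:", scores[:, 0].round(2))

# Hierarchical clustering (Ward linkage) to group similar stations
Z = linkage(X_std, method="ward")
groups = fcluster(Z, t=2, criterion="maxclust")
print("station groups:", groups)
```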

Relevance: 30.00%

Abstract:

In general, laboratory activities are costly in terms of time, space, and money. As such, the ability to provide realistically simulated laboratory data that enables students to practice data analysis techniques as a complementary activity would reduce these costs while opening up very interesting possibilities. In the present work, a novel methodology is presented for the design of analytical chemistry instrumental analysis exercises that can be automatically personalized for each student and evaluated immediately. The proposed system provides each student with a different set of experimental data generated randomly while satisfying a set of constraints, rather than using data obtained from actual laboratory work. This allows the instructor to provide students with a set of practical problems to complement their regular laboratory work, along with the corresponding feedback provided by the system's automatic evaluation process. To this end, the Goodle Grading Management System (GMS), an innovative web-based educational tool for automating the collection and assessment of practical exercises for engineering and scientific courses, was developed. The proposed methodology takes full advantage of the Goodle GMS fusion code architecture. The design of a particular exercise is provided ad hoc by the instructor and requires basic Matlab knowledge. The system has been employed with satisfactory results in several university courses. To demonstrate the automatic evaluation process, three exercises are presented in detail. The first exercise involves a linear regression analysis of data and the calculation of the quality parameters of an instrumental analysis method. The second and third exercises address two different comparison tests: a comparison test of the mean and a paired t-test.
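
The sketch below illustrates, under assumed constraints, the general idea of generating a randomized calibration data set per student and evaluating a linear regression with basic quality parameters. It is not the Goodle GMS code (which the abstract describes as Matlab-based), just a hedged Python analogue with hypothetical parameter ranges.

```python
import numpy as np
from scipy import stats

def generate_calibration_data(rng, n=6, slope_range=(0.05, 0.2),
                              intercept_range=(0.0, 0.05), noise_sd=0.01):
    """Randomly generate a calibration line subject to simple constraints."""
    conc = np.linspace(1.0, 10.0, n)                       # standard concentrations
    slope = rng.uniform(*slope_range)
    intercept = rng.uniform(*intercept_range)
    signal = slope * conc + intercept + rng.normal(0, noise_sd, n)
    return conc, signal

def quality_parameters(conc, signal):
    """Slope, intercept, R^2 and a 3.3*s_residual/slope detection-limit estimate."""
    res = stats.linregress(conc, signal)
    residual_sd = np.std(signal - (res.slope * conc + res.intercept), ddof=2)
    lod = 3.3 * residual_sd / res.slope
    return {"slope": res.slope, "intercept": res.intercept,
            "r_squared": res.rvalue**2, "LOD": lod}

rng = np.random.default_rng(42)   # a different seed per student personalizes the data
conc, signal = generate_calibration_data(rng)
print(quality_parameters(conc, signal))
```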

Relevance: 30.00%

Abstract:

The progress of the severity of southern rust in maize (Zea mays), caused by Puccinia polysora, was quantified in staggered plantings in different geographical areas of Brazil, from October to May, over two years (1995-1996 and 1996-1997). The logistic model, fitted to the data, described the disease progress curves better than the Gompertz model. Four components of the disease progress curves (maximum disease severity; area under the disease progress curve, AUDPC; area under the disease progress curve around the inflection point, AUDPCi; and epidemic rate) were used to compare the epidemics in different areas and at different planting times. The AUDPC, AUDPCi, and the epidemic rate were analyzed in relation to the weather variables (temperature, relative humidity, hours of relative humidity >90%, and rainfall) recorded during the trials. Disease severity reached levels greater than 30% in Piracicaba and Guaíra in the plantings between December and January. Lower values of AUDPC occurred in later plantings at both locations. The epidemic rate was positively correlated (P < 0.05) with the mean daily temperatures and negatively correlated with hours of relative humidity >90%. The AUDPC was not correlated with any weather variable. The AUDPCi was negatively related to both humidity variables, but not to rainfall. Long periods (mostly >13 h per day) of relative humidity >90% (which corresponded to leaf wetness) occurred in Castro. Severity of southern rust in maize has always been low in Castro, hence the negative correlations between disease and the two humidity variables.
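
As an illustration of the two key calculations mentioned above, the sketch below fits a logistic disease progress model to severity assessments and computes the area under the disease progress curve (AUDPC) by trapezoidal integration. The severity values and assessment dates are hypothetical, not the trial data.

```python
import numpy as np
from scipy.optimize import curve_fit

def logistic(t, y_max, r, t_mid):
    """Logistic disease progress model: severity as a function of time."""
    return y_max / (1.0 + np.exp(-r * (t - t_mid)))

# Hypothetical assessments: days after planting and severity (proportion 0-1)
days = np.array([20, 30, 40, 50, 60, 70, 80], dtype=float)
severity = np.array([0.01, 0.03, 0.08, 0.18, 0.28, 0.33, 0.35])

# Fit the logistic model (initial guesses: maximum severity, rate, inflection time)
params, _ = curve_fit(logistic, days, severity, p0=[0.4, 0.1, 50.0])
y_max, r, t_mid = params
print(f"epidemic rate r = {r:.3f}, inflection at {t_mid:.1f} days")

# AUDPC via the trapezoidal rule over the assessment dates
audpc = np.trapz(severity, days)
print(f"AUDPC = {audpc:.2f}")
```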

Relevance: 30.00%

Abstract:

The existence of a minimum grain storage capacity as a condition for maintaining regulatory physical stocks has been a strategic factor in agribusiness expansion. However, in Brazil the storage infrastructure has not kept pace with the growth of the agricultural sector. This is evident in the case of soybeans, which currently represent 49% of grain production in the country and whose production volume has been increasing significantly over the years. This study aimed to predict the future needs for static soybean storage capacity from historical data and to estimate the investment needed to install storage units in Brazil over the next five years. A statistical analysis of the collected data allowed a forecast and the identification of the number of storage units that should be installed to meet soybean storage needs over the next five years. It was concluded that by 2015 the soybean storage capacity should be 87 million tons and that, to store 49% of the soybeans produced, 1,104 storage units should be installed at a cost of R$ 442 million.
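
A hedged sketch of the kind of trend extrapolation implied above is shown below: fit a linear trend to a historical series and project it five years ahead. The production figures and the 49% storage share are hypothetical placeholders used only to show the mechanics, not the data or the model used in the study.

```python
import numpy as np

# Hypothetical historical soybean production (million tons) by year
years = np.array([2005, 2006, 2007, 2008, 2009, 2010])
production = np.array([51.2, 52.5, 58.4, 60.0, 57.2, 68.7])

# Fit a linear trend and extrapolate five years ahead
slope, intercept = np.polyfit(years, production, deg=1)
future_years = np.arange(2011, 2016)
forecast = slope * future_years + intercept

# Static storage needed if 49% of the forecast production must be stored
storage_needed = 0.49 * forecast
for y, f, s in zip(future_years, forecast, storage_needed):
    print(f"{y}: forecast {f:.1f} Mt, soybean storage needed {s:.1f} Mt")
```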

Relevance: 30.00%

Abstract:

Clustering soil and crop data can be used as a basis for the definition of management zones because the data are grouped into clusters based on the similar interaction of these variables. Therefore, the objective of this study was to identify management zones using fuzzy c-means clustering analysis based on the spatial and temporal variability of soil attributes and corn yield. The study site (18 by 250 m in size) was located in Jaboticabal, São Paulo, Brazil. Corn yield was measured in one hundred 4.5 by 10 m cells along four parallel transects (25 observations per transect) over five growing seasons between 2001 and 2010. Soil chemical and physical attributes were measured. The SAS procedure MIXED was used to identify which variables most influenced the spatial variability of corn yield over the five study years. Base saturation (BS) was the variable most closely related to corn yield; thus, semivariogram models were fitted for BS and corn yield, and the data values were then kriged. Management Zone Analyst software was used to run the fuzzy c-means clustering algorithm. The optimum number of management zones can change over time, as can the degree of agreement between the BS and corn yield management zone maps. Thus, it is very important to take the temporal variability of crop yield and soil attributes into account to delineate management zones accurately.
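
To make the clustering step concrete, below is a minimal NumPy implementation of the fuzzy c-means algorithm applied to hypothetical per-cell observations (e.g., standardized base saturation and yield); it is a generic sketch, not the Management Zone Analyst software.

```python
import numpy as np

def fuzzy_c_means(X, c=2, m=2.0, max_iter=100, tol=1e-5, seed=0):
    """Minimal fuzzy c-means: returns cluster centers and the membership matrix."""
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    U = rng.random((c, n))
    U /= U.sum(axis=0)                      # memberships sum to 1 per sample
    for _ in range(max_iter):
        Um = U ** m
        centers = (Um @ X) / Um.sum(axis=1, keepdims=True)
        # Distances from every sample to every center
        d = np.linalg.norm(X[None, :, :] - centers[:, None, :], axis=2)
        d = np.fmax(d, 1e-12)
        # Standard FCM membership update: u_ik = 1 / sum_j (d_ik/d_jk)^(2/(m-1))
        U_new = 1.0 / np.sum((d[:, None, :] / d[None, :, :]) ** (2.0 / (m - 1.0)), axis=1)
        if np.linalg.norm(U_new - U) < tol:
            U = U_new
            break
        U = U_new
    return centers, U

# Hypothetical cells described by standardized base saturation and yield
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(-1, 0.3, (50, 2)), rng.normal(1, 0.3, (50, 2))])
centers, U = fuzzy_c_means(X, c=2)
zones = U.argmax(axis=0)                    # hard assignment to management zones
print(centers.round(2), np.bincount(zones))
```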

Relevance: 30.00%

Abstract:

Wind power is a low-carbon form of energy production that reduces society's dependence on fossil fuels. Finland has adopted wind energy production into its climate change mitigation policy, which has led to changes in legislation and guidelines, the regional allocation of wind power areas, and the establishment of a feed-in tariff. Wind power production has indeed accelerated in Finland after two decades of relatively slow growth; for instance, from 2010 to 2011 wind energy production increased by 64%. Nevertheless, there is still a long way to the national goal of 6 TWh by 2020. This thesis introduces a GIS-based decision-support methodology for the preliminary identification of suitable areas for wind energy production, including an estimation of their level of risk. The goal of this study was to define the least risky places for wind energy development within Kemiönsaari municipality in Southwest Finland. Spatial multicriteria decision analysis (SMCDA) has been used for searching suitable wind power areas along with many other location-allocation problems. SMCDA scrutinizes complex, ill-structured decision problems in a GIS environment using constraints and evaluation criteria, which are aggregated using weighted linear combination (WLC). Weights for the evaluation criteria were acquired using the analytic hierarchy process (AHP) with nine expert interviews. Subsequently, feasible alternatives were ranked in order to provide a recommendation, and finally a sensitivity analysis was conducted to determine the robustness of the recommendation. The first study aim was to scrutinize the suitability and necessity of existing data for this SMCDA study. Most of the available data sets were of sufficient resolution and quality. Input data necessity was evaluated qualitatively for each data set based on, e.g., constraint coverage and attribute weights. Attribute quality was estimated mainly qualitatively in terms of comprehensiveness, operationality, measurability, completeness, decomposability, minimality and redundancy. The most significant quality issue was redundancy, as interdependencies are not tolerated by WLC and AHP does not include measures to detect them. The third aim was to define the least risky areas for wind power development within the study area. The two highest-ranking areas were Nordanå-Lövböle and Påvalsby, followed by Helgeboda, Degerdal, Pungböle, Björkboda, and Östanå-Labböle. The fourth aim was to assess the reliability of the recommendation; the two top-ranking areas proved robust, whereas the others were more sensitive.
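
The sketch below illustrates the two computational pieces named above: deriving criterion weights from an AHP pairwise comparison matrix (principal eigenvector) and aggregating standardized criterion layers by weighted linear combination. The comparison matrix, criterion names and raster values are hypothetical, not the thesis data.

```python
import numpy as np

def ahp_weights(pairwise):
    """Criterion weights from the principal eigenvector of an AHP comparison matrix."""
    eigvals, eigvecs = np.linalg.eig(pairwise)
    k = np.argmax(eigvals.real)
    w = np.abs(eigvecs[:, k].real)
    w /= w.sum()
    # Consistency index; divide by the random index RI = 0.58 (valid for n = 3 below)
    n = pairwise.shape[0]
    ci = (eigvals[k].real - n) / (n - 1)
    return w, ci / 0.58

# Hypothetical 3-criterion comparison matrix (e.g., wind speed, grid distance, noise)
A = np.array([[1.0, 3.0, 5.0],
              [1/3.0, 1.0, 2.0],
              [1/5.0, 1/2.0, 1.0]])
weights, cr = ahp_weights(A)
print("weights:", weights.round(3), "consistency ratio:", round(cr, 3))

# Weighted linear combination of standardized (0-1) criterion layers
rng = np.random.default_rng(0)
criteria = rng.random((3, 100))            # 3 layers x 100 candidate cells
suitability = weights @ criteria           # higher score = more suitable / less risky
print("best cell index:", suitability.argmax())
```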

Relevance: 30.00%

Abstract:

This thesis aims to find an effective way of conducting a target audience analysis (TAA) in the cyber domain. Two main focal points are addressed: the nature of the cyber domain and the method of the TAA. For the cyber domain, the objective is to find the opportunities, restrictions and caveats that result from its digital and temporal nature. This is the environment in which the TAA method is examined in this study. As the TAA is an important step of any psychological operation and critical to its success, the method used must cover all the main aspects affecting the choice of a proper target audience. The first part of the research was done by sending an open-ended questionnaire to operators in the field of information warfare both in Finland and abroad. As the results were inconclusive, the research was completed by assessing the applicability of the United States Army field manual FM 3-05.301 in the cyber domain via a theory-based content analysis. FM 3-05.301 was chosen because it presents a complete method of the TAA process. The findings were tested against the results of the questionnaire and recent scientific research in the field of psychology. The cyber domain was found to be “fast and vast”, volatile and uncontrollable. Although governed by laws to some extent, the cyber domain is unpredictable by nature and cannot be controlled to any reasonable degree. The anonymity and lack of verification often present in digital channels mean that anyone can have an opinion, and any message sent may change or even become counterproductive to its original purpose. The TAA method of FM 3-05.301 is applicable in the cyber domain, although some parts of the method are outdated and are therefore suggested to be updated if used in that environment. The target audience categories of step two of the process were replaced by new groups that exist in the digital environment. The accessibility assessment (step eight) was also redefined, as in digital media the mere existence of a written text is typically not enough to convey the intended message to the target audience. Scientific studies in computer science, psychology and sociology on the behavior of people in social media (and in the cyber domain overall) call for a more extensive remake of the TAA process. This, however, falls outside the scope of this work. It is thus suggested that further research be carried out on computer-assisted methods and a more thorough TAA process, utilizing the latest findings on human behavior.

Relevance: 30.00%

Abstract:

Concentration-response curves in isometric tension studies on isolated blood vessels have traditionally been obtained. Although parameters such as Imax, EC50 and pA2 may be readily calculated, this approach does not provide information on the temporal profile of the responses or the actual nature of the reaction curves. Computerized data acquisition systems can be used to obtain averaged data that represent a new source of otherwise inaccessible information, since early and late responses may be observed separately and in detail.
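
For readers unfamiliar with how Imax and EC50 are extracted from such curves, here is a minimal sketch fitting a Hill-type concentration-response model to averaged data; the concentrations and responses are hypothetical, not data from the study.

```python
import numpy as np
from scipy.optimize import curve_fit

def hill(log_conc, e_max, log_ec50, n_h):
    """Hill-type concentration-response model in log10(concentration) space."""
    return e_max / (1.0 + 10 ** (n_h * (log_ec50 - log_conc)))

# Hypothetical cumulative concentrations (M) and isometric tension responses (% of maximum)
conc = np.array([1e-9, 1e-8, 1e-7, 1e-6, 1e-5, 1e-4])
resp = np.array([2.0, 10.0, 35.0, 72.0, 92.0, 98.0])

# Fit for maximal effect, log EC50 and Hill slope
params, _ = curve_fit(hill, np.log10(conc), resp, p0=[100.0, -7.0, 1.0])
e_max, log_ec50, n_h = params
print(f"Imax ~ {e_max:.1f} %, EC50 ~ {10**log_ec50:.2e} M, Hill slope ~ {n_h:.2f}")
```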

Relevance: 30.00%

Abstract:

In a market where companies of similar size and resources compete, it is challenging to gain any advantage over the others. In order to stay afloat, a company needs the capability to perform with fewer resources and yet provide better service. Hence, the development of efficient processes that can cut costs and improve performance is crucial. As a business expands, processes become complicated and large amounts of data need to be managed and made available on request. Companies use different tools to store and manage data, which facilitates better production and transactions. In the modern business world the most utilized tool for this purpose is the ERP (Enterprise Resource Planning) system. The focus of this research is to study how competitive advantage can be achieved by implementing a proprietary ERP system in a company: an ERP system that is created in-house and tailor-made to match and align with business needs and processes. The market is full of ERP software, but choosing the right one is a big challenge. Identifying the key features that need improvement in processes and data management, choosing the right ERP, implementing it and following up is a long and expensive journey for companies. Some companies prefer to invest in a ready-made package bought from a vendor and adjust it to their own business needs, while others focus on creating their own system with in-house IT capabilities. In this research a case company is used, and the author tries to identify and analyze why the organization in question decided to pursue the development of a proprietary ERP system, how it has been implemented and whether it has been successful. The main conclusion and recommendation of this research is that companies should know their core capabilities and constraints before choosing and implementing an ERP system. Knowledge of the factors that affect the outcome of a system change is important in order to make the right decisions at the strategic level and implement them at the operational level. The project in the case company has lasted longer than anticipated; however, it has been reported that projects involving ready-made products bought from vendors are also delayed and completed over budget. In general, the implementation of the proprietary ERP in the case company has been successful, both in terms of business performance figures and the usability of the system by employees. In terms of future research, a study statistically comparing the ROI of both approaches, buying a ready-made product and creating one's own ERP, would be beneficial.

Relevance: 30.00%

Abstract:

In this thesis, the suitability of different trackers for finger tracking in high-speed videos was studied. Tracked finger trajectories from the videos were post-processed and analysed using various filtering and smoothing methods. Position derivatives of the trajectories, speed and acceleration, were extracted for the purposes of hand motion analysis. Overall, two methods, Kernelized Correlation Filters and Spatio-Temporal Context Learning tracking, performed better than the others in the tests. Both achieved high accuracy on the selected high-speed videos and also allowed real-time processing, being able to process over 500 frames per second. In addition, the results showed that different filtering methods can be applied to produce more appropriate velocity and acceleration curves calculated from the tracking data. Local regression filtering and the unscented Kalman smoother gave the best results in the tests. Furthermore, the results show that these tracking and filtering methods are suitable for high-speed hand tracking and trajectory-data post-processing.
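
To illustrate the trajectory post-processing step, the sketch below smooths a tracked 2-D finger trajectory and derives speed and acceleration by finite differences. It uses a Savitzky-Golay filter as a simple stand-in for the local regression and unscented Kalman smoothers mentioned above; the trajectory and frame rate are hypothetical.

```python
import numpy as np
from scipy.signal import savgol_filter

fps = 500.0                                  # hypothetical high-speed frame rate
dt = 1.0 / fps

# Hypothetical tracked finger positions (pixels) over one second
t = np.arange(0, 1, dt)
rng = np.random.default_rng(0)
x = 200 + 50 * np.sin(2 * np.pi * 2 * t) + rng.normal(0, 0.5, t.size)
y = 300 + 30 * np.cos(2 * np.pi * 2 * t) + rng.normal(0, 0.5, t.size)

# Smooth the raw trajectory before differentiating (window length must be odd)
x_s = savgol_filter(x, window_length=31, polyorder=3)
y_s = savgol_filter(y, window_length=31, polyorder=3)

# Velocity and acceleration by central finite differences
vx, vy = np.gradient(x_s, dt), np.gradient(y_s, dt)
speed = np.hypot(vx, vy)                     # pixels per second
ax, ay = np.gradient(vx, dt), np.gradient(vy, dt)
accel = np.hypot(ax, ay)
print(f"peak speed {speed.max():.0f} px/s, peak acceleration {accel.max():.0f} px/s^2")
```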

Relevance: 30.00%

Abstract:

Preparative liquid chromatography is one of the most selective separation techniques in the fine chemical, pharmaceutical, and food industries. Several process concepts have been developed and applied to improve the performance of classical batch chromatography. The most powerful approaches include various single-column recycling schemes, counter-current and cross-current multi-column setups, and hybrid processes where chromatography is coupled with other unit operations such as crystallization, a chemical reactor, and/or a solvent removal unit. To fully utilize the potential of stand-alone and integrated chromatographic processes, efficient methods for selecting the best process alternative as well as optimal operating conditions are needed. In this thesis, a unified method is developed for the analysis and design of the following single-column fixed-bed processes and corresponding cross-current schemes: (1) batch chromatography, (2) batch chromatography with an integrated solvent removal unit, (3) mixed-recycle steady state recycling chromatography (SSR), and (4) mixed-recycle steady state recycling chromatography with solvent removal from the fresh feed, the recycle fraction, or the column feed (SSR–SR). The method is based on the equilibrium theory of chromatography with the assumption of negligible mass transfer resistance and axial dispersion. The design criteria are given in a general, dimensionless form that is formally analogous to that applied widely in the so-called triangle theory of counter-current multi-column chromatography. Analytical design equations are derived for binary systems that follow the competitive Langmuir adsorption isotherm model. For this purpose, the existing analytic solution of the ideal model of chromatography for binary Langmuir mixtures is completed by deriving the missing explicit equations for the height and location of the pure first-component shock in the case of a small feed pulse. It is thus shown that the entire chromatographic cycle at the column outlet can be expressed in closed form. The developed design method allows predicting the feasible range of operating parameters that lead to the desired product purities. It can be applied for the calculation of first estimates of optimal operating conditions, the analysis of process robustness, and the early-stage evaluation of different process alternatives. The design method is used to analyse the possibility of enhancing the performance of conventional SSR chromatography by integrating it with a solvent removal unit. It is shown that the amount of fresh feed processed during a chromatographic cycle, and thus the productivity of the SSR process, can be improved by removing solvent. The maximum solvent removal capacity depends on the location of the solvent removal unit and the physical solvent removal constraints, such as solubility, viscosity, and/or osmotic pressure limits. Usually, the most flexible option is to remove solvent from the column feed. The applicability of the equilibrium design to real, non-ideal separation problems is evaluated by means of numerical simulations. Due to the assumption of infinite column efficiency, the developed design method is most applicable to high-performance systems where thermodynamic effects are predominant, while significant deviations are observed under highly non-ideal conditions. The findings based on the equilibrium theory are applied to develop a shortcut approach for the design of chromatographic separation processes under strongly non-ideal conditions with significant dispersive effects. The shortcut method is based on a simple procedure applied to a single conventional chromatogram. The applicability of the approach to the design of batch and counter-current simulated moving bed processes is evaluated with case studies. It is shown that the shortcut approach performs better the higher the column efficiency and the lower the purity constraints.
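
As a concrete reference for the isotherm model used throughout the design equations, the sketch below evaluates a binary competitive Langmuir isotherm of the form q_i = a_i c_i / (1 + b_1 c_1 + b_2 c_2); the parameter values and concentrations are hypothetical and only illustrate the shape of the model.

```python
import numpy as np

def competitive_langmuir(c1, c2, a=(2.0, 3.0), b=(0.05, 0.10)):
    """Binary competitive Langmuir isotherm: solid-phase concentrations q1, q2."""
    denom = 1.0 + b[0] * c1 + b[1] * c2
    q1 = a[0] * c1 / denom
    q2 = a[1] * c2 / denom
    return q1, q2

# Hypothetical liquid-phase concentrations (g/L) of the two components
c1 = np.linspace(0.0, 10.0, 6)
c2 = np.linspace(0.0, 10.0, 6)
q1, q2 = competitive_langmuir(c1, c2)
for ci, cj, qi, qj in zip(c1, c2, q1, q2):
    print(f"c1={ci:4.1f} c2={cj:4.1f} -> q1={qi:5.2f} q2={qj:5.2f}")
```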

Relevance: 30.00%

Abstract:

In 2006 UPM was able to gain a level of social legitimacy that allowed it to carry out one of the largest industrial restructuring programmes in Finnish industrial history, shut down major operations in Finland, and still appear to be functioning in the interests of the nation as well as itself. This study examines various contexts of this shutdown with the aim of demonstrating how profoundly mediated such organizational events are, even though they appear to be produced primarily through strategic company decisions. The study aims to examine the processes of mediation at two levels. At one level, through close analysis of press releases and newspaper reports in local and national newspapers, the study presents a discursive analysis of the Voikkaa case. The discursive analysis focuses on providing historical contexts for understanding why this organizational event was also an occasion for reimagining the past and future of the Finnish nation; spatial contexts for understanding the differing struggles over the meaning of the event nationally and regionally; and the temporal dynamics of the media reports. At another level, the study considers and refines methods for reading and analyzing mediation in organization studies. Bringing together recent media text-based legitimation studies, emerging research on organizational memory and organizational death, and a Foucauldian analytics of power, this work suggests that organizational research needs to be less concerned with particular typologies and narratives of shutdowns, and more curious about the processes of mediation through which organizational events are imagined and remembered.

Relevance: 30.00%

Abstract:

In this study, the behavioral and electroencephalographic (EEG) features of seizures induced in rats by intrahippocampal injection of granulitoxin, a neurotoxic peptide from the sea anemone Bunodosoma granulifera, were determined. The first alterations occurred during microinjection of granulitoxin (8 µg) into the dorsal hippocampus and consisted of seizure activity that began in the hippocampus and spread rapidly to the occipital cortex. This activity lasted 20-30 s, during which the rats presented immobility. During the first 40-50 min after administration, three to four other similar short EEG seizure periods occurred and the rats presented the following behavioral alterations: akinesia, facial automatisms, head tremor, salivation, rearing, jumping, barrel-rolling, wet dog shakes and forelimb clonic movements. Within 40-50 min, status epilepticus was established and lasted 8-12 h. These results are similar to those observed in the acute phase of the pilocarpine model of temporal lobe epilepsy and suggest that granulitoxin may be a useful tool not only to study sodium channels, but also to develop a new experimental model of status epilepticus.

Relevance: 30.00%

Abstract:

Previously reported neuroimaging studies have shown functional and morphological changes in temporal lobe structures of panic patients, but only one used a volumetric method. The aim of the present study was to determine the volume of temporal lobe structures in patients with panic disorder, as measured by magnetic resonance imaging. Eleven panic patients and eleven controls matched for age, sex, handedness, socioeconomic status and years of education participated in the study. The mean volume of the left temporal lobe of panic patients was 9% smaller than that of controls (t(21) = 2.37, P = 0.028). In addition, there was a trend (P values between 0.05 and 0.10) towards smaller volumes of the right temporal lobe (7%, t(21) = 1.99, P = 0.06), right amygdala (8%, t(21) = 1.83, P = 0.08), left amygdala (5%, t(21) = 1.78, P = 0.09) and left hippocampus (9%, t(21) = 1.93, P = 0.07) in panic patients compared with controls. There was a positive correlation between left hippocampal volume and duration of panic disorder (r = 0.67, P = 0.025), with recent cases showing more reduction than older cases. The present results show that panic patients have a decreased volume of the left temporal lobe and indicate the presence of volumetric abnormalities of temporal lobe structures.
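
The comparisons reported above boil down to group t-tests and a Pearson correlation; the sketch below reproduces that style of analysis on hypothetical volume data (it does not use, and will not reproduce, the study's measurements or test statistics).

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical left temporal lobe volumes (cm^3): 11 panic patients vs 11 matched controls
patients = rng.normal(68.0, 6.0, 11)
controls = rng.normal(75.0, 6.0, 11)

# Group comparison of mean volumes
t_val, p_val = stats.ttest_ind(patients, controls)
reduction = 100 * (1 - patients.mean() / controls.mean())
print(f"t = {t_val:.2f}, P = {p_val:.3f}, patient volumes ~{reduction:.0f}% smaller")

# Correlation between hippocampal volume and disorder duration (hypothetical values)
duration_years = rng.uniform(0.5, 15, 11)
hippocampus = 2.5 + 0.03 * duration_years + rng.normal(0, 0.15, 11)
r, p_r = stats.pearsonr(duration_years, hippocampus)
print(f"Pearson r = {r:.2f}, P = {p_r:.3f}")
```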