914 results for Multiple-scale processing


Relevance:

30.00%

Publisher:

Abstract:

Hebb proposed that synapses between neurons that fire synchronously are strengthened, forming cell assemblies and phase sequences. The former, on a shorter scale, are ensembles of synchronized cells that function transiently as a closed processing system; the latter, on a larger scale, correspond to the sequential activation of cell assemblies able to represent percepts and behaviors. Nowadays, the recording of large neuronal populations allows for the detection of multiple cell assemblies. Within Hebb's theory, the next logical step is the analysis of phase sequences. Here we detected phase sequences as consecutive assembly activation patterns, and then analyzed their graph attributes in relation to behavior. We investigated action potentials recorded from the adult rat hippocampus and neocortex before, during and after novel object exploration (experimental periods). Within assembly graphs, each assembly corresponded to a node, and each edge corresponded to the temporal sequence of consecutive node activations. The sum of all assembly activations was proportional to firing rates, but the activity of individual assemblies was not. Assembly repertoire was stable across experimental periods, suggesting that novel experience does not create new assemblies in the adult rat. Assembly graph attributes, on the other hand, varied significantly across behavioral states and experimental periods, and were separable enough to correctly classify experimental periods (Naïve Bayes classifier; maximum AUROCs ranging from 0.55 to 0.99) and behavioral states (waking, slow wave sleep, and rapid eye movement sleep; maximum AUROCs ranging from 0.64 to 0.98). Our findings agree with Hebb's view that assemblies correspond to primitive building blocks of representation, nearly unchanged in the adult, while phase sequences are labile across behavioral states and change after novel experience. The results are compatible with a role for phase sequences in behavior and cognition.
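To make the graph construction concrete, the following is a minimal, illustrative sketch (not the authors' pipeline) of how a sequence of assembly activations could be turned into a directed assembly graph and summarized by a few attributes of the kind that could feed a classifier; the activation sequence, the library choice (networkx) and the attribute set are assumptions for illustration only.

```python
# Illustrative sketch (not the authors' pipeline): build a directed graph from a
# sequence of assembly activations and compute a few simple graph attributes.
# The activation sequence and attribute choices here are hypothetical examples.
import networkx as nx

def assembly_graph(activation_sequence):
    """Each assembly is a node; each consecutive activation pair becomes an edge."""
    g = nx.DiGraph()
    for a, b in zip(activation_sequence, activation_sequence[1:]):
        if g.has_edge(a, b):
            g[a][b]["weight"] += 1
        else:
            g.add_edge(a, b, weight=1)
    return g

# Hypothetical phase sequence: assemblies labeled by integer IDs.
sequence = [3, 1, 4, 1, 5, 2, 1, 3, 3, 2]
g = assembly_graph(sequence)

# Example graph attributes that could feed a classifier of behavioral state.
features = {
    "n_nodes": g.number_of_nodes(),
    "n_edges": g.number_of_edges(),
    "density": nx.density(g),
    "mean_out_degree": sum(d for _, d in g.out_degree()) / g.number_of_nodes(),
}
print(features)
```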

Relevance:

30.00%

Publisher:

Abstract:

The air-sea flux of greenhouse gases (e.g. carbon dioxide, CO2) is a critical part of the climate system and a major factor in the biogeochemical development of the oceans. More accurate and higher-resolution calculations of these gas fluxes are required if we are to fully understand and predict our future climate. Satellite Earth observation can provide large-spatial-scale datasets that can be used to study gas fluxes. However, the large storage requirements needed to host such data can restrict its use by the scientific community. Fortunately, the development of cloud computing can provide a solution. Here we describe an open-source air-sea CO2 flux processing toolbox called 'FluxEngine', designed for use on a cloud-computing infrastructure. The toolbox allows users to easily generate global and regional air-sea CO2 flux data from model, in situ and Earth observation data, and its air-sea gas flux calculation is user configurable. Its current installation on the Nephalae cloud allows users to easily exploit more than 8 terabytes of climate-quality Earth observation data for the derivation of gas fluxes. The resultant NetCDF output files contain more than 20 data layers covering the various stages of the flux calculation, along with process-indicator layers to aid interpretation of the data. This paper describes the toolbox design, verifies the air-sea CO2 flux calculations, demonstrates the use of the tools for studying global and shelf-sea air-sea fluxes, and outlines future developments.
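As a point of reference for what such a toolbox computes, below is a minimal sketch of the standard bulk formulation of the air-sea CO2 flux, F = k · K0 · (pCO2,water − pCO2,air); this is not the FluxEngine API, and the Wanninkhof-style quadratic gas-transfer parameterization and the input values are simplified, illustrative assumptions.

```python
# Minimal sketch of a bulk air-sea CO2 flux calculation; NOT the FluxEngine API.
# The transfer-velocity parameterization and input values are illustrative.

def transfer_velocity_cm_per_hr(wind_speed_ms, schmidt_number):
    """Quadratic wind-speed parameterization (Wanninkhof-type), normalized to Sc = 660."""
    return 0.251 * wind_speed_ms ** 2 * (660.0 / schmidt_number) ** 0.5

def co2_flux_mol_m2_yr(wind_speed_ms, schmidt_number, solubility_mol_m3_uatm,
                       pco2_water_uatm, pco2_air_uatm):
    """Bulk flux F = k * K0 * (pCO2_water - pCO2_air); positive means outgassing."""
    k_cm_hr = transfer_velocity_cm_per_hr(wind_speed_ms, schmidt_number)
    k_m_yr = k_cm_hr * 1e-2 * 24 * 365  # convert cm/hr -> m/yr
    return k_m_yr * solubility_mol_m3_uatm * (pco2_water_uatm - pco2_air_uatm)

# Illustrative inputs: 7 m/s wind, Sc = 660, K0 ~ 3e-5 mol m-3 uatm-1, delta-pCO2 = -50 uatm.
print(co2_flux_mol_m2_yr(7.0, 660.0, 3.0e-5, 350.0, 400.0))  # negative -> net ocean uptake
```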

Relevance:

30.00%

Publisher:

Abstract:

Objectives Dietary fibre (DF) is one of the components of diet that strongly contributes to health improvements, particularly in the gastrointestinal system. Hence, this work intended to evaluate the relations between some sociodemographic variables, such as age, gender, level of education, living environment or country, and the levels of knowledge about dietary fibre (KADF), its sources and its effects on human health, using a validated scale. Study design The present study was a cross-sectional study. Methods A methodological study was conducted with 6010 participants residing in 10 countries from different continents (Europe, America, Africa). The instrument was a self-response questionnaire aimed at collecting information on knowledge about dietary fibre. The instrument had been used to validate a scale (KADF), whose model was used in the present work to identify the best predictors of knowledge. The statistical tools used were as follows: basic descriptive statistics, decision trees and inferential analysis (t-test for independent samples with Levene's test, and one-way ANOVA with post hoc multiple-comparison tests). Results The results showed that the best predictor for the three types of knowledge evaluated (about DF, about its sources and about its effects on human health) was always the country, meaning that social, cultural and/or political conditions greatly determine the level of knowledge. On the other hand, the tests also showed that statistically significant differences were found regarding the three types of knowledge for all sociodemographic variables evaluated: age, gender, level of education, living environment and country. Conclusions The results showed that actions planned to improve the level of knowledge should not be designed in general terms intended to reach all sectors of the population; rather, in addressing different groups, different methodologies must be designed so as to provide effective health education.
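As a minimal sketch of the inferential steps named above (Levene's test, independent-samples t-test, one-way ANOVA), the snippet below runs them on hypothetical knowledge scores with SciPy; the data, group means and the 0.05 threshold are illustrative assumptions, not the study's data or code.

```python
# Minimal sketch of the inferential tests described above, applied to hypothetical
# KADF knowledge scores; not the study's data or code.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
scores_male = rng.normal(3.5, 0.6, 200)    # hypothetical knowledge scores by gender
scores_female = rng.normal(3.7, 0.6, 220)

# Levene's test for equality of variances, then an independent-samples t-test.
lev_stat, lev_p = stats.levene(scores_male, scores_female)
t_stat, t_p = stats.ttest_ind(scores_male, scores_female,
                              equal_var=(lev_p > 0.05))

# One-way ANOVA across several (hypothetical) countries; post hoc comparisons
# could follow with e.g. statsmodels' pairwise_tukeyhsd.
country_groups = [rng.normal(m, 0.6, 150) for m in (3.2, 3.6, 3.9)]
f_stat, f_p = stats.f_oneway(*country_groups)
print(lev_p, t_p, f_p)
```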

Relevance:

30.00%

Publisher:

Abstract:

Measuring the extent to which a piece of structural timber has distorted at a macroscopic scale is fundamental to assessing its viability as a structural component. From the sawmill to the construction site, as structural timber dries, distortion can render it unsuitable for its intended purposes. This rejection of unusable timber is a considerable source of waste to the timber industry and the wider construction sector. As such, ensuring accurate measurement of distortion is a key step in addressing inefficiencies within timber processing. Currently, the FRITS frame method is the established approach used to gain an understanding of timber surface profile. The method, while reliable, is dependent upon relatively few measurements taken across a limited area of the overall surface, with a great deal of interpolation required. Further, the process is unavoidably slow and cumbersome, the immobile scanning equipment limiting where and when measurements can be taken and constraining the process as a whole. This thesis seeks to introduce LiDAR scanning as a new, alternative approach to distortion feature measurement. Although LiDAR scanning is in its infancy as a measurement technique within timber research, the practicalities of using it as a measurement method are herein demonstrated, exploiting many of the advantages the technology has over current approaches. LiDAR scanning creates a much more comprehensive image of a timber surface, generating input data several orders of magnitude larger than that of the FRITS frame. Set-up and scanning with LiDAR is also much quicker and more flexible than with existing methods. With LiDAR scanning the measurement process is freed from many of the constraints of the FRITS frame and can be carried out in almost any environment. For this thesis, surface scans were carried out on seven Sitka spruce samples of dimensions 48.5 × 102 × 3000 mm using both the FRITS frame and the LiDAR scanner. The samples used presented marked levels of distortion and were relatively free from knots. A computational measurement model was created to extract feature measurements from the raw LiDAR data, enabling an assessment of each piece of timber to be carried out in accordance with existing standards. Assessment of distortion features focused primarily on the measurement of twist due to its strong prevalence in spruce and the considerable concern it generates within the construction industry. Additional measurements of surface inclination and bow were also made with each method to further establish LiDAR's credentials as a viable alternative. Overall, feature measurements generated by the new LiDAR method compared well with those of the established FRITS method. From these investigations, recommendations were made to address inadequacies within existing measurement standards, namely their reliance on generalised and interpretative descriptions of distortion. The potential for further uses of LiDAR scanning within timber research is also discussed.
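As an illustration of how a distortion feature such as twist might be extracted from LiDAR data, the sketch below slices a point cloud of a board's upper face along its length, fits the cross-width surface inclination in each slice, and reports the end-to-end rotation; this is a hypothetical approach for illustration, not the thesis's computational measurement model, and the coordinate conventions are assumed.

```python
# Illustrative sketch (not the thesis's measurement model): estimate twist from a
# LiDAR point cloud of a board's top surface by fitting a line across the width in
# each lengthwise slice and tracking how the surface inclination angle rotates.
import numpy as np

def slice_inclination_deg(points_xyz, x_lo, x_hi):
    """Fit height z as a linear function of width y within one lengthwise slice."""
    sl = points_xyz[(points_xyz[:, 0] >= x_lo) & (points_xyz[:, 0] < x_hi)]
    # Slices are assumed non-empty for this sketch.
    slope, _intercept = np.polyfit(sl[:, 1], sl[:, 2], 1)
    return np.degrees(np.arctan(slope))

def twist_deg(points_xyz, n_slices=30):
    """Twist = rotation of the cross-sectional inclination from one end to the other."""
    x = points_xyz[:, 0]
    edges = np.linspace(x.min(), x.max(), n_slices + 1)
    angles = [slice_inclination_deg(points_xyz, lo, hi)
              for lo, hi in zip(edges[:-1], edges[1:])]
    return angles[-1] - angles[0]

# Hypothetical usage: `cloud` is an (N, 3) array of x (length), y (width), z (height)
# points from a registered LiDAR scan of the board's upper face.
# print(twist_deg(cloud))
```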

Relevance:

30.00%

Publisher:

Abstract:

Insights into the genomic adaptive traits of Treponema pallidum, the causative bacterium of syphilis, have long been hampered by the absence of in vitro culture models and the constraints associated with its propagation in rabbits. Here, we have bypassed the culture bottleneck by means of a targeted strategy never before applied to uncultivable bacterial human pathogens to directly capture whole-genome T. pallidum data in the context of human infection. This strategy has unveiled a scenario of discreet T. pallidum interstrain single-nucleotide-polymorphism-based microevolution, contrasting with rampant within-patient genetic heterogeneity mainly targeting multiple phase-variable loci and a major antigen-coding gene (tprK). TprK demonstrated remarkable variability and redundancy, both within and between patients, suggesting ongoing parallel adaptive diversification during human infection. Some bacterial functions (for example, flagella- and chemotaxis-associated) were systematically targeted by both inter- and intrastrain single nucleotide polymorphisms, as well as by ongoing within-patient phase variation events. Finally, patient-derived genomes possess previously unreported mutations in a penicillin-binding-protein-coding gene (mrcA), unveiling it as a candidate target for investigating impacts on penicillin susceptibility. Our findings decode the major genetic mechanisms by which T. pallidum promotes immune evasion and survival, and demonstrate the exceptional power of characterizing evolving pathogen subpopulations during human infection.

Relevance:

30.00%

Publisher:

Abstract:

Presenteeism consists of going to work while in no condition to perform productively, which can have a much higher impact than absenteeism on the productivity of an organisation. Presenteeism manifests in both physical and psychological disturbances. It is a reality that is difficult to quantify, as is its translation into direct and indirect costs within the organisation. Our goal was to analyse the effects of presenteeism on the productivity of a company in the food-processing sector through a descriptive, cross-sectional study of an exploratory nature. The Stanford Presenteeism Scale SPS-6 (validated by Ferreira et al., 2010) and a semi-structured interview were used. Most of the workers reported having already gone to work feeling ill on at least two days in the last year, mentioning that their health condition affected their performance and made them feel desperate and lacking pleasure in work. Management mentioned that presenteeism has a direct impact on productivity without, however, being able to quantify its true costs. Presenteeism is a reality in organisational settings, standing out in the educational and health sectors. We underline the importance of making organisations aware of psychosocial risks and of the importance of healthy leadership, control of work stress, and the presence of clinical psychologists and professional coaches.

Relevance:

30.00%

Publisher:

Abstract:

Four-skills tests for young native speakers commonly do not show incongruent correlations with the cognitive strategies frequently reported. For non-native speakers, there is sparse evidence to determine which tasks are important to properly assess cognitive and academic language proficiency (Cummins, 1980; 2012). Research questions: (1) young students with an immigrant background are highly likely to differ significantly in their communication strategies and skills in a second-language processing context; (2) building on this first assumption, teachers are expected to differ significantly depending on their scientific area and previous training. Purpose: This study intends to examine whether school teachers (K-12) from different scientific domains of teaching and training perceive differently an adapted four-skills scale in European Portuguese. Research methods: 77 teachers from five scientific areas, mean length of teaching service = 32 years (SD = 2.7), 57 males and 46 females (from basic and high school levels). Main findings: ANOVA (effect sizes and post hoc Tukey tests) and linear regression analysis (stepwise method) revealed statistically significant differences among teachers of different areas, mainly between language teachers and science teachers. Language teachers perceive more accurately how tasks map onto the broad skills that need to be measured in non-native students. Conclusion: If teachers perceive the importance of the big-four tasks differently, there will be incongruence in the skills measures that teachers select for immigrant pupils. Unbalanced tasks, and teachers' perceptions of evaluation and of students' competence, would likely limit the academic and cognitive development of non-native students. Furthermore, the results showed sufficient evidence to conclude that tasks are perceived differently by teachers with regard to the importance of specific skill subareas. Reading skills are considered more important than oral comprehension skills for non-native students.

Relevance:

30.00%

Publisher:

Abstract:

Poster presented at: 21st World Hydrogen Energy Conference 2016, Zaragoza, Spain, 13-16 June 2016.

Relevance:

30.00%

Publisher:

Abstract:

Strawberries harvested for processing as frozen fruits are currently de-calyxed manually in the field. This process requires the removal of the stem cap with green leaves (i.e. the calyx) and incurs many disadvantages when performed by hand. Not only does it require maintaining cutting-tool sanitation, but it also increases labor time and the exposure of the de-capped strawberries before in-plant processing. This leads to labor inefficiency and decreased harvest yield. By moving the calyx removal process from the fields to the processing plants, this new practice would reduce field labor and improve management and logistics, while increasing annual yield. As labor prices continue to increase, the strawberry industry has shown great interest in the development and implementation of an automated calyx removal system. In response, this dissertation describes the design, operation, and performance of a full-scale automatic vision-guided intelligent de-calyxing (AVID) prototype machine. The AVID machine utilizes commercially available equipment to produce a relatively low-cost automated de-calyxing system that can be retrofitted into existing food processing facilities. This dissertation is broken up into five sections. The first two sections include a machine overview and a 12-week processing plant pilot study. Results of the pilot study indicate the AVID machine is able to de-calyx grade-1-with-cap conical strawberries at roughly 66 percent output weight yield at a throughput of 10,000 pounds per hour. The remaining three sections describe in detail the three main components of the machine: a strawberry loading and orientation conveyor, a machine vision system for calyx identification, and a synchronized multi-waterjet knife calyx removal system. In short, the loading system utilizes rotational energy to orient conical strawberries. The machine vision system determines cut locations through RGB real-time feature extraction. The high-speed multi-waterjet knife system uses direct drive actuation to locate 30,000 psi cutting streams to precise coordinates for calyx removal. Based on the observations and studies performed within this dissertation, the AVID machine is seen to be a viable option for automated high-throughput strawberry calyx removal. A summary of future tasks and further improvements is discussed at the end.
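To illustrate the kind of color-based feature extraction a calyx-identification vision system can rely on, the sketch below segments green (calyx) pixels in HSV space and proposes a cut row just below them; the thresholds, margin and function name are illustrative assumptions, not the AVID machine's actual algorithm.

```python
# Illustrative sketch of color-based calyx localization; not the AVID machine's
# actual vision algorithm. Assumes a BGR image of a single oriented berry.
import cv2
import numpy as np

def calyx_cut_row(image_bgr, margin_px=5):
    """Return the image row just below the green calyx region (a candidate cut line)."""
    hsv = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2HSV)
    # Hue range for green (calyx); these thresholds are illustrative and would need tuning.
    green_mask = cv2.inRange(hsv, (35, 60, 40), (85, 255, 255))
    rows = np.where(green_mask.any(axis=1))[0]
    if rows.size == 0:
        return None  # no calyx detected in this frame
    return int(rows.max()) + margin_px  # lowest green row plus a safety margin

# Hypothetical usage with a camera frame:
# frame = cv2.imread("berry.png")
# cut_row = calyx_cut_row(frame)
```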

Relevance:

30.00%

Publisher:

Abstract:

New constraints on isotope fractionation factors in inorganic aqueous sulfur systems based on theoretical and experimental techniques relevant to studies of the sulfur cycle in modern environments and the geologic rock record are presented in this dissertation. These include theoretical estimations of equilibrium isotope fractionation factors utilizing quantum mechanical software and a water cluster model approach for aqueous sulfur compounds that span the entire range of oxidation state for sulfur. These theoretical calculations generally reproduce the available experimental determinations from the literature and provide new constraints where no others are available. These theoretical calculations illustrate in detail the relationship between sulfur bonding environment and the mass dependence associated with equilibrium isotope exchange reactions involving all four isotopes of sulfur. I additionally highlight the effect of isomers of protonated compounds (compounds with the same chemical formula but different structure, where protons are bound to either sulfur or oxygen atoms) on isotope partitioning in the sulfite (S4+) and sulfoxylate (S2+) systems, both of which are key intermediates in oxidation-reduction processes in the sulfur cycle. I demonstrate that isomers containing the highest degree of coordination around sulfur (where protonation occurs on the sulfur atom) have a strong influence on isotopic fractionation factors, and argue that these isomerization phenomena should be considered in models of the sulfur cycle. Additionally, experimental results of the reaction rates and isotope fractionations associated with the chemical oxidation of aqueous sulfide are presented. Sulfide oxidation is a major process in the global sulfur cycle due largely to the sulfide-producing activity of anaerobic microorganisms in organic-rich marine sediments. These experiments reveal relationships between isotope fractionations and reaction rate as a function of both temperature and trace metal (ferrous iron) catalysis that I interpret in the context of the complex mechanism of sulfide oxidation. I also demonstrate that sulfide oxidation is associated with a mass dependence that does not conform to the mass dependence typically associated with equilibrium isotope exchange. This observation has implications for the inclusion of oxidative processes in environmental- and global-scale models of the sulfur cycle based on the mass balance of all four isotopes of sulfur. The contents of this dissertation provide key reference information on isotopic fractionation factors in aqueous sulfur systems that will have far-reaching applicability to studies of the sulfur cycle in a wide variety of natural settings.
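For readers unfamiliar with the four-isotope bookkeeping referred to above, the snippet below computes the capital-delta notation commonly used to test whether a process conforms to a reference mass dependence, Δ33S = δ33S − 1000 × ((1 + δ34S/1000)^θ − 1) with θ ≈ 0.515 for equilibrium exchange; the delta values are illustrative, and this is standard notation rather than a result of the dissertation.

```python
# Standard multiple-sulfur-isotope bookkeeping used to test whether a process
# follows the mass dependence expected for equilibrium exchange. The reference
# exponent ~0.515 is the commonly used equilibrium value; the delta values
# below are purely illustrative.

def cap_delta_33S(delta33S_permil, delta34S_permil, exponent=0.515):
    """Delta33S = delta33S - 1000 * ((1 + delta34S/1000)**exponent - 1)."""
    return delta33S_permil - 1000.0 * ((1.0 + delta34S_permil / 1000.0) ** exponent - 1.0)

# A process conforming to the reference mass dependence gives Delta33S ~ 0;
# deviations indicate a different mass dependence (or mixing of reservoirs).
print(cap_delta_33S(delta33S_permil=10.2, delta34S_permil=20.0))
```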

Relevance:

30.00%

Publisher:

Abstract:

In this work, we further extend the recently developed adaptive data analysis method, the Sparse Time-Frequency Representation (STFR) method. This method is based on the assumption that many physical signals inherently contain AM-FM representations. We propose a sparse optimization method to extract the AM-FM representations of such signals. We prove the convergence of the method for periodic signals under certain assumptions and provide practical algorithms specifically for the non-periodic STFR, which extends the method to tackle problems that former STFR methods could not handle, including stability to noise and non-periodic data analysis. This is a significant improvement since many adaptive and non-adaptive signal processing methods are not fully capable of handling non-periodic signals. Moreover, we propose a new STFR algorithm to study intrawave signals with strong frequency modulation and analyze the convergence of this new algorithm for periodic signals. Such signals have previously remained a bottleneck for all signal processing methods. Furthermore, we propose a modified version of STFR that facilitates the extraction of intrawaves that have overlapping frequency content. We show that the STFR methods can be applied to the realm of dynamical systems and cardiovascular signals. In particular, we present a simplified and modified version of the STFR algorithm that is potentially useful for the diagnosis of some cardiovascular diseases. We further explain some preliminary work on the nature of Intrinsic Mode Functions (IMFs) and how they can have different representations in different phase coordinates. This analysis shows that the uncertainty principle is fundamental to all oscillating signals.
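To make the AM-FM assumption concrete, the sketch below synthesizes a signal of the form a(t)·cos(θ(t)) and recovers its instantaneous frequency with a Hilbert-transform baseline; this illustrates the signal model only and is explicitly not the STFR sparse-optimization algorithm, and all signal parameters are illustrative.

```python
# Illustration of the AM-FM model a(t)*cos(theta(t)) assumed by STFR, with a simple
# Hilbert-transform baseline for instantaneous frequency; this is NOT the STFR
# sparse-optimization algorithm itself.
import numpy as np
from scipy.signal import hilbert

fs = 1000.0
t = np.arange(0, 2.0, 1.0 / fs)
envelope = 1.0 + 0.3 * np.sin(2 * np.pi * 0.5 * t)          # slowly varying amplitude a(t)
phase = 2 * np.pi * (10 * t + 2.0 * np.sin(2 * np.pi * t))  # frequency-modulated phase theta(t)
signal = envelope * np.cos(phase)

analytic = hilbert(signal)
inst_freq = np.diff(np.unwrap(np.angle(analytic))) * fs / (2 * np.pi)
print(inst_freq.mean())  # ~10 Hz carrier, modulated by the FM term
```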

Relevance:

30.00%

Publisher:

Abstract:

New and promising treatments for coronary heart disease are enabled by vascular scaffolds made of poly(L-lactic acid) (PLLA), as demonstrated by Abbott Vascular’s bioresorbable vascular scaffold. PLLA is a semicrystalline polymer whose degree of crystallinity and crystalline microstructure depend on the thermal and deformation history during processing. In turn, the semicrystalline morphology determines scaffold strength and biodegradation time. However, spatially-resolved information about the resulting material structure (crystallinity and crystal orientation) is needed to interpret in vivo observations.

The first manufacturing step of the scaffold is tube expansion in a process similar to injection blow molding. Spatial uniformity of the tube microstructure is essential for the consistent production and performance of the final scaffold. For implantation into the artery, solid-state deformation below the glass transition temperature is imposed on a laser-cut subassembly to crimp it into a small diameter. Regions of localized strain during crimping are implicated in deployment behavior.

To examine the semicrystalline microstructure development of the scaffold, we employed complementary techniques of scanning electron and polarized light microscopy, wide-angle X-ray scattering, and X-ray microdiffraction. These techniques enabled us to assess the microstructure at micro and nano length scales. The results show that the expanded tube is very uniform in the azimuthal and axial directions and that radial variations are more pronounced. The crimping step dramatically changes the microstructure of the subassembly by imposing extreme elongation and compression. Spatial information on the degree and direction of chain orientation from X-ray microdiffraction data gives insight into the mechanism by which the PLLA dissipates the stresses during crimping, without fracture. Finally, analysis of the microstructure after deployment shows that it is inherited from the crimping step and contributes to the scaffold's successful implantation in vivo.
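As background on how a degree of crystallinity is typically estimated from wide-angle X-ray scattering data, the sketch below partitions an azimuthally integrated 1D intensity profile into crystalline peaks and an amorphous halo; the crude baseline subtraction and the variable names are illustrative assumptions, not the analysis used in this work.

```python
# Illustrative estimate of degree of crystallinity from a 1D WAXS profile:
# X_c = A_crystalline / (A_crystalline + A_amorphous). The peak/halo separation
# here is a crude baseline subtraction, not this dissertation's actual analysis.
import numpy as np
from scipy.integrate import trapezoid

def crystallinity_index(two_theta_deg, intensity, amorphous_halo):
    """Integrate crystalline peak area above a supplied amorphous-halo profile."""
    crystalline = np.clip(intensity - amorphous_halo, 0.0, None)
    a_cryst = trapezoid(crystalline, two_theta_deg)
    a_total = trapezoid(intensity, two_theta_deg)
    return a_cryst / a_total

# Hypothetical usage: `two_theta`, `i_obs`, and a fitted `i_amorph` halo would come
# from azimuthally integrated WAXS data for a given scaffold location.
# print(crystallinity_index(two_theta, i_obs, i_amorph))
```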

Relevance:

30.00%

Publisher:

Abstract:

By providing vehicle-to-vehicle and vehicle-to-infrastructure wireless communications, vehicular ad hoc networks (VANETs), also known as the "networks on wheels", can greatly enhance traffic safety, traffic efficiency and the driving experience for intelligent transportation systems (ITS). However, the unique features of VANETs, such as high mobility and uneven distribution of vehicular nodes, impose critical challenges of high efficiency and reliability for the implementation of VANETs. This dissertation is motivated by the great application potential of VANETs in the design of efficient in-network data processing and dissemination. Considering the significance of message aggregation, data dissemination and data collection, this dissertation research aims to enhance traffic safety and traffic efficiency, as well as to develop novel commercial applications based on VANETs, along four directions: 1) accurate and efficient message aggregation to detect on-road safety-relevant events, 2) reliable data dissemination to notify remote vehicles, 3) efficient and reliable spatial data collection from vehicular sensors, and 4) novel promising applications to exploit the commercial potential of VANETs. Specifically, to enable cooperative detection of safety-relevant events on the roads, the structure-less message aggregation (SLMA) scheme is proposed to improve communication efficiency and message accuracy. The scheme of relative-position-based message dissemination (RPB-MD) is proposed to reliably and efficiently disseminate messages to all intended vehicles in the zone of relevance under varying traffic density. Given the large volume of vehicular sensor data available in VANETs, the scheme of compressive-sampling-based data collection (CS-DC) is proposed to efficiently collect spatially relevant data at a large scale, especially in dense traffic. In addition, with novel and efficient solutions proposed for the application-specific issues of data dissemination and data collection, several appealing value-added applications for VANETs are developed to exploit their commercial potential, namely general purpose automatic survey (GPAS), VANET-based ambient ad dissemination (VAAD) and VANET-based vehicle performance monitoring and analysis (VehicleView). Thus, by improving the efficiency and reliability of in-network data processing and dissemination, including message aggregation, data dissemination and data collection, together with the development of novel promising applications, this dissertation will help push VANETs further toward the stage of massive deployment.

Relevance:

30.00%

Publisher:

Abstract:

Gasarite structures are a unique type of metallic foam containing tubular pores. The original methods for their production limited them to laboratory study despite appealing foam properties. Thermal decomposition processing of gasarites holds the potential to increase the application of gasarite foams in engineering design by removing several barriers to their industrial-scale production. The following study characterized thermal decomposition gasarite processing both experimentally and theoretically. It was found that significant variation was inherent to this process; therefore, several modifications were necessary to produce gasarites using this method. Conventional means to increase porosity and enhance pore morphology were studied. Pore morphology was determined to be more easily replicated if pores were stabilized by alumina additions and powders were dispersed evenly. In order to better characterize processing, high-temperature, high-ramp-rate thermal decomposition data were gathered. It was found that the high-ramp-rate thermal decomposition behavior of several hydrides was more rapid than hydride kinetics at low ramp rates. These data were then used to estimate the contribution of several pore formation mechanisms to the development of pore structure. It was found that gas-metal eutectic growth can only be a viable pore formation mode if non-equilibrium conditions persist. Bubble capture cannot be a dominant pore growth mode due to high bubble terminal velocities. Direct gas evolution appears to be the most likely pore formation mode due to the high gas evolution rate from the decomposing particulate and to microstructural pore growth trends. The overall process was evaluated for its economic viability. It was found that thermal decomposition has potential for industrialization, but further refinements are necessary in order for the process to be viable.

Relevance:

30.00%

Publisher:

Abstract:

Analyzing large-scale gene expression data is a labor-intensive and time-consuming process. To make data analysis easier, we developed a set of pipelines for rapid processing and analysis of poplar gene expression data for knowledge discovery. Among the pipelines developed, the differentially expressed genes (DEG) pipeline is designed to identify biologically important genes that are differentially expressed at one of multiple time points or conditions. The pathway analysis pipeline was designed to identify differentially expressed metabolic pathways. The protein domain enrichment pipeline can identify the enriched protein domains present in the DEGs. Finally, the Gene Ontology (GO) enrichment analysis pipeline was developed to identify the enriched GO terms in the DEGs. Our pipeline tools can analyze both microarray data and high-throughput sequencing data, which are obtained by two different technologies. Microarray technology measures gene expression levels via microarray chips, collections of microscopic DNA spots attached to a solid (glass) surface, whereas high-throughput sequencing, also called next-generation sequencing, measures gene expression levels by directly sequencing mRNAs and obtaining each mRNA's copy number in cells or tissues. We also developed a web portal (http://sys.bio.mtu.edu/) to make all pipelines available to the public and help users analyze their gene expression data. In addition to the analyses mentioned above, it can also perform GO hierarchy analysis, i.e. construct GO trees using a list of GO terms as input.
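As an illustration of the statistics that typically underlie GO-term enrichment of a DEG list, the snippet below applies a hypergeometric test with SciPy; the gene counts are hypothetical and this is not the portal's implementation.

```python
# Sketch of the hypergeometric test that typically underlies GO-term enrichment
# of a DEG list; illustrative only, not the web portal's implementation.
from scipy.stats import hypergeom

def go_enrichment_pvalue(n_genome, n_term_genome, n_deg, n_term_deg):
    """P(observing >= n_term_deg annotated genes among n_deg DEGs by chance)."""
    # Survival function at k-1 gives P(X >= k) for the hypergeometric distribution.
    return hypergeom.sf(n_term_deg - 1, n_genome, n_term_genome, n_deg)

# Hypothetical counts: 30,000 genes in the genome, 400 annotated with the GO term,
# 1,200 DEGs, 40 of which carry the term.
print(go_enrichment_pvalue(30000, 400, 1200, 40))
```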