920 results for Processing and sinterization


Relevance:

90.00%

Publisher:

Abstract:

Nonlinear optics is a broad field of research and technology that spans Physics, Chemistry, and Engineering. It is the branch of Optics that describes the behaviour of light in nonlinear media, that is, media in which the dielectric polarization P responds nonlinearly to the electric field E of the light. This nonlinearity is typically observed only at very high light intensities. The area has applications in all-optical and electro-optical devices used for communication, optical storage and optical computing. Many nonlinear optical effects have proved to be versatile probes for understanding basic and applied problems. Nonlinear optical devices exploit the nonlinear dependence of the refractive index or absorption coefficient on the applied field. They are passive devices and are referred to as intelligent or smart materials, because the sensing, processing and actuating functions required for optical processes, which are otherwise separate in dynamic devices, are inherent to them. The large interest in nonlinear optical crystalline materials has been motivated by their potential use in the fabrication of all-optical photonic devices. Transparent crystalline materials can exhibit different kinds of optical nonlinearities, all associated with a nonlinear polarization. The choice of the most suitable crystal material for a given application is often far from trivial and should involve the consideration of many aspects: a high nonlinearity for frequency conversion of ultra-short pulses does not help if the interaction length is strongly limited by a large group-velocity mismatch, and a low damage threshold limits the applicable optical intensities. It can also be highly desirable to use a crystal material that can be critically phase-matched at room temperature. Among the different types of nonlinear crystals, metal halides and tartrates have attracted attention owing to their importance in photonics.
Metal halides such as lead halides have drawn attention because they exhibit interesting features from the standpoint of the electron-lattice interaction. These materials are important for their luminescent properties. Tartrate single crystals show many interesting physical properties, such as ferroelectric, piezoelectric, dielectric and optical characteristics, and their optical transmission characteristics make them suitable for nonlinear optical devices. Among the several tartrate compounds, strontium tartrate, calcium tartrate and cadmium tartrate have received the greatest attention on account of their ferroelectric, nonlinear optical and spectral characteristics. The present thesis reports the linear and nonlinear optical properties of these crystals and their potential applications in the field of photonics.
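The nonlinear response of the polarization P to the field E described above is conventionally written as a power-series expansion in the optical susceptibilities:

```latex
P = \varepsilon_0 \left( \chi^{(1)} E + \chi^{(2)} E^{2} + \chi^{(3)} E^{3} + \cdots \right)
```

Here the linear susceptibility χ⁽¹⁾ accounts for the ordinary refractive index and absorption, χ⁽²⁾ gives rise to second-harmonic generation and frequency mixing (and vanishes in centrosymmetric media), and χ⁽³⁾ underlies the intensity-dependent refractive index and absorption exploited by the devices mentioned above.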


Cashew is an important commodity traded across the continents, and the world cashew industry is the livelihood of more than three million people worldwide, the majority of whom are women from socially and economically disadvantaged communities of the developing nations. The cashew tree was originally planted to prevent soil erosion, and it was during the beginning of the 19th century that cashew kernels attained the status of a food item. The kernels later attained the status of an international commodity, with India exporting its first consignment of cashew kernels to the U.S.A. in 1920. India was the first country to bring cashew to the world market as a commodity, and it pioneered cashew processing as an industry. For decades India enjoyed a monopoly in the world cashew industry in raw nut production (cultivation), processing and market share in international trade. The liberalisation of international trade has brought a major transition to the world of cashew. India began to benefit from the trade policy, which improved its supply of raw nuts from other producing countries and accelerated its growth in the processing of raw nuts and the export of cashew kernels. At the same time, its domestic consumption grew so rapidly that, by the beginning of the new century, it had emerged as the world's largest consumer of cashew kernels as well.


Over recent decades, work on infrared sensor applications has progressed considerably worldwide. A difficulty remains, however: objects are not always clear enough, or cannot always be easily distinguished, in the image obtained of the observed scene. Infrared image enhancement has played an important role in the development of infrared computer vision, image processing, non-destructive testing, and related technologies. This thesis addresses infrared image enhancement techniques in two respects: the processing of a single infrared image in the hybrid space-frequency domain, and the fusion of infrared and visible images using the non-subsampled contourlet transform (NSCT). Image fusion can be regarded as a continuation of single-infrared-image enhancement: it combines infrared and visible images into a single image so as to represent and enhance all the useful information and features of the source images, since a single image cannot contain all the relevant or available information, owing to the restrictions of any single imaging sensor. We review the development of infrared image enhancement techniques, then turn to single-infrared-image enhancement and propose a hybrid-domain enhancement scheme with an improved fuzzy threshold evaluation method, which yields higher image quality and improves human visual perception. The infrared-visible fusion techniques are built on an accurate registration of the source images acquired by different sensors.
The SURF-RANSAC algorithm is applied for registration throughout this research, yielding very accurately registered images and increased benefits for the fusion processing. For the fusion of infrared and visible images, a series of advanced and efficient approaches is proposed. A standard multi-channel NSCT-based fusion method is presented as a reference for the fusion approaches that follow. A joint fusion approach involving the Adaptive-Gaussian NSCT and the wavelet transform (WT) is proposed, which leads to fusion results that are better than those obtained with general non-adaptive methods. An NSCT-based fusion approach employing compressed sensing (CS) and total variation (TV) on sparse sample coefficients, with accurate reconstruction of the fused coefficients, is proposed; it achieves much better fusion results through pre-enhancement of the infrared image and by reducing the redundant information in the fusion coefficients. Finally, an NSCT-based fusion procedure using a fast iterative-shrinking compressed sensing (FISCS) technique is proposed to compress the decomposed coefficients and reconstruct the fused coefficients during the fusion process, leading to better results more quickly and efficiently.
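As a minimal illustration of the RANSAC step in such a registration pipeline (not the thesis implementation: a 2D affine model fitted to synthetic correspondences stands in for SURF keypoint matches), one can repeatedly fit a transform on random minimal subsets and keep the one with the most inliers:

```python
import numpy as np

def fit_affine(src, dst):
    """Least-squares 2D affine transform mapping src -> dst (N x 2 arrays)."""
    n = len(src)
    A = np.zeros((2 * n, 6))
    A[0::2, 0:2] = src
    A[0::2, 2] = 1.0
    A[1::2, 3:5] = src
    A[1::2, 5] = 1.0
    p, *_ = np.linalg.lstsq(A, dst.reshape(-1), rcond=None)
    return p.reshape(2, 3)               # [[a, b, tx], [c, d, ty]]

def ransac_affine(src, dst, n_iter=200, tol=2.0, seed=0):
    """RANSAC: fit on 3 random correspondences per iteration, keep the
    transform with the most inliers, then refit on all inliers."""
    rng = np.random.default_rng(seed)
    best_inliers = np.zeros(len(src), dtype=bool)
    for _ in range(n_iter):
        idx = rng.choice(len(src), size=3, replace=False)
        M = fit_affine(src[idx], dst[idx])
        pred = src @ M[:, :2].T + M[:, 2]
        inliers = np.linalg.norm(pred - dst, axis=1) < tol
        if inliers.sum() > best_inliers.sum():
            best_inliers = inliers
    return fit_affine(src[best_inliers], dst[best_inliers]), best_inliers
```

The minimal-subset-plus-consensus loop is what makes the registration robust to mismatched keypoints; the final least-squares refit over all inliers recovers an accurate transform even with a large fraction of gross outliers.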


[EN] Aeolian dust plays an important role in climate and ocean processes. In particular, Saharan dust deposition is important in the Canary Current because of its content of iron minerals, which fertilize the ocean. In this work, dust particles are characterized mainly by granulometry, morphometry and mineralogy, using image processing and scanning electron microscopy, in the region off northern Mauritania and the Western Sahara. The concentration of terrigenous material was measured in three environments: the atmosphere (300 m above sea level), the mixed layer at 10 m depth, and 150 m depth. Samples were collected before and during dust events, allowing the effect of Saharan dust inputs on the water column to be assessed. The dominant grain size was coarse silt.



In the casting of metals, tundish flow, welding, converters, and other metal-processing applications, the behaviour of the fluid surface is important. In aluminium alloys, for example, oxides formed on the surface may be drawn into the body of the melt, where they act as faults in the solidified product, affecting cast quality. For this reason, wave behaviour, air entrapment, and other effects need to be described accurately, in the presence of heat transfer and possibly phase change. The authors have developed a single-phase algorithm for modelling this problem. The Scalar Equation Algorithm (SEA) (see Refs. 1 and 2) enables the transport of the property discontinuity representing the free surface through a fixed grid. An extension of this method to unstructured mesh codes is presented here, together with validation. The new method employs a TVD flux limiter in conjunction with a ray-tracing algorithm to ensure a sharply bounded interface. Applications of the method are in the filling and emptying of mould cavities, with heat transfer and phase change.
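As a generic sketch of how a TVD flux limiter keeps a transported discontinuity sharp and bounded (this is not the SEA of Refs. 1 and 2, just a minimal 1D minmod-limited advection scheme on a periodic grid for illustration):

```python
import numpy as np

def minmod(a, b):
    """Minmod TVD limiter: zero at extrema, smallest slope otherwise."""
    return np.where(a * b > 0.0, np.where(np.abs(a) < np.abs(b), a, b), 0.0)

def advect(u, a, dx, dt, steps):
    """MUSCL-type update for u_t + a u_x = 0 (a > 0), periodic domain.
    Piecewise-linear reconstruction with minmod slopes keeps the scheme
    TVD: the transported discontinuity stays sharp without overshoot."""
    nu = a * dt / dx                      # Courant number, must be <= 1
    for _ in range(steps):
        du_m = u - np.roll(u, 1)          # backward differences
        du_p = np.roll(u, -1) - u         # forward differences
        slope = minmod(du_m, du_p)
        u_face = u + 0.5 * (1.0 - nu) * slope   # upwind face value
        flux = a * u_face
        u = u - dt / dx * (flux - np.roll(flux, 1))
    return u
```

Because the limited scheme creates no new extrema, a 0/1 step profile (a crude stand-in for the free-surface property discontinuity) stays within its original bounds while being advected, and the conservative flux form preserves total mass exactly.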


Metal casting is a process governed by the interaction of a range of physical phenomena. Most computational models of this process address only what are conventionally regarded as the primary phenomena: heat conduction and solidification. However, predicting other phenomena, such as porosity formation, requires modelling the interaction of the fluid flow, heat transfer, solidification and the development of stress-deformation in the solidified part of the casting. This paper describes a modelling framework called PHYSICA[1] which has the capability to simulate such multi-physics phenomena.


Thesis (Ph.D.)--University of Washington, 2016-06


Obesity is a major challenge to human health worldwide. Little is known about the brain mechanisms that are associated with overeating and obesity in humans. In this project, multimodal neuroimaging techniques were used to study brain neurotransmission and anatomy in obesity. Bariatric surgery served as an experimental method for assessing whether possible differences between obese and non-obese individuals change following weight loss. This could indicate whether obesity-related altered neurotransmission and cerebral atrophy are recoverable or whether they represent stable individual characteristics. Morbidly obese subjects (BMI ≥ 35 kg/m²) and non-obese control subjects (mean BMI 23 kg/m²) were studied with positron emission tomography (PET) and magnetic resonance imaging (MRI). The PET studies focused on the dopaminergic and opioidergic systems, both of which are crucial in reward processing. Brain dopamine D2 receptor (D2R) availability was measured using [11C]raclopride and µ-opioid receptor (MOR) availability using [11C]carfentanil. The MRI studies used voxel-based morphometry (VBM) of T1-weighted MRI images, coupled with diffusion tensor imaging (DTI). Obese subjects underwent bariatric surgery as their standard clinical treatment during the study. Preoperatively, morbidly obese subjects had significantly lower MOR availability but unaltered D2R availability in several brain regions involved in reward processing, including the striatum, insula, and thalamus. Moreover, obesity disrupted the interaction between the MOR and D2R systems in the ventral striatum. Bariatric surgery and the concomitant weight loss normalized MOR availability in the obese, but did not influence D2R availability in any brain region. Morbidly obese subjects also had significantly lower grey and white matter densities globally in the brain, with more focal changes located in areas associated with inhibitory control, reward processing, and appetite.
DTI also revealed signs of axonal damage in the obese in the corticospinal tracts and occipito-frontal fascicles. Surgery-induced weight loss resulted in global recovery of white matter density, as well as more focal recovery of grey matter density, among obese subjects. Altogether, these results show that the endogenous opioid system is fundamentally linked to obesity. Lowered MOR availability is likely a consequence of obesity and may mediate the maintenance of excessive energy uptake. In addition, obesity has adverse effects on brain structure. Bariatric surgery, however, reverses MOR dysfunction and recovers cerebral atrophy. Understanding the opioidergic contribution to overeating and obesity is critical for developing new psychological or pharmacological treatments for obesity. The actual molecular mechanisms behind the positive changes in structure and neurotransmitter function remain unclear and should be addressed in future research.


Current hearing-assistive technology performs poorly in noisy multi-talker conditions. The goal of this thesis was to establish the feasibility of using EEG to guide acoustic processing in such conditions. To attain this goal, this research developed a model via the constructive research method, relying on a literature review. Several approaches have yielded improvements in the performance of hearing-assistive devices under multi-talker conditions, namely beamforming spatial filtering, model-based sparse coding shrinkage, and onset enhancement of the speech signal. Prior research has shown that electroencephalography (EEG) signals contain information on whether the person is actively listening, what the listener is listening to, and where the attended sound source is. This thesis constructed a model for using EEG information to control beamforming, model-based sparse coding shrinkage, and onset enhancement of the speech signal. The purpose of this model is to propose a framework for using EEG signals to control sound processing so as to select a single talker in a noisy environment containing multiple talkers speaking simultaneously. On a theoretical level, the model showed that EEG can control acoustical processing. An analysis of the model identified a requirement for real-time processing, and showed that the model inherits the computationally intensive properties of acoustical processing, although the model itself is of low complexity, placing a relatively small load on computational resources. A research priority is to develop a prototype that controls hearing-assistive devices with EEG. The thesis concludes by highlighting challenges for future research.
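As a minimal sketch of one ingredient above, beamforming spatial filtering, here is a generic frequency-domain delay-and-sum beamformer for a linear microphone array (an illustration only; the thesis model concerns EEG control of such processing, not this particular implementation):

```python
import numpy as np

def delay_and_sum(signals, mic_pos, angle, fs, c=343.0):
    """Steer a linear microphone array toward `angle` (radians from
    broadside) by delaying each channel and summing.  Delays are applied
    in the frequency domain, so fractional-sample shifts are exact.
    signals: (n_mics, n_samples); mic_pos: (n_mics,) positions in metres."""
    n_mics, n = signals.shape
    delays = mic_pos * np.sin(angle) / c           # per-channel delay, s
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    spec = np.fft.rfft(signals, axis=1)
    # advancing each channel by its delay aligns the attended wavefront
    phase = np.exp(2j * np.pi * freqs[None, :] * delays[:, None])
    aligned = np.fft.irfft(spec * phase, n=n, axis=1)
    return aligned.mean(axis=0)
```

Steering toward the attended talker sums that talker's wavefront coherently while sound from other directions adds with phase offsets and is attenuated; in the proposed framework, the steering angle is exactly the kind of parameter EEG-derived attention information would set.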


In today’s big data world, data is being produced in massive volumes, at great velocity, and from a variety of sources such as mobile devices, sensors, a plethora of small devices hooked to the internet (the Internet of Things), social networks, communication networks and many others. Interactive querying and large-scale analytics are increasingly used to derive value from this big data. A large portion of this data is stored and processed in the Cloud due to the several advantages the Cloud provides, such as scalability, elasticity, availability, low cost of ownership and overall economies of scale. There is thus a growing need for large-scale cloud-based data management systems that can support real-time ingest, storage and processing of large volumes of heterogeneous data. However, in the pay-as-you-go Cloud environment, the cost of analytics can grow linearly with the time and resources required. Reducing the cost of data analytics in the Cloud thus remains a primary challenge. In my dissertation research, I have focused on building efficient and cost-effective cloud-based data management systems for different application domains that are predominant in cloud computing environments. In the first part of my dissertation, I address the problem of reducing the cost of transactional workloads on relational databases to support database-as-a-service in the Cloud. The primary challenges in supporting such workloads include choosing how to partition the data across a large number of machines, minimizing the number of distributed transactions, providing high data availability, and tolerating failures gracefully.
I have designed, built and evaluated SWORD, an end-to-end scalable online transaction processing system that utilizes workload-aware data placement and replication to minimize the number of distributed transactions, and that incorporates a suite of novel techniques to significantly reduce the overheads incurred both during the initial placement of data and during query execution at runtime. In the second part of my dissertation, I focus on sampling-based progressive analytics as a means to reduce the cost of data analytics in the relational domain. Sampling has traditionally been used by data scientists to get progressive answers to complex analytical tasks over large volumes of data. Typically, this involves manually extracting samples of increasing size (progressive samples) for exploratory querying. This provides data scientists with user control, repeatable semantics, and result provenance. However, such solutions result in tedious workflows that preclude the reuse of work across samples. On the other hand, existing approximate query processing systems report early results, but do not offer the above benefits for complex ad-hoc queries. I propose a new progressive data-parallel computation framework, NOW!, that provides support for progressive analytics over big data. In particular, NOW! enables progressive relational (SQL) query support in the Cloud using unique progress semantics that allow efficient and deterministic query processing over samples, providing meaningful early results and provenance to data scientists. NOW! enables the provision of early results using significantly fewer resources, thereby enabling a substantial reduction in the cost incurred during such analytics. Finally, I propose NSCALE, a system for efficient and cost-effective complex analytics on large-scale graph-structured data in the Cloud.
The system is based on the key observation that a wide range of complex analysis tasks over graph data require processing and reasoning about a large number of multi-hop neighborhoods or subgraphs in the graph; examples include ego network analysis, motif counting in biological networks, finding social circles in social networks, personalized recommendations, link prediction, etc. These tasks are not well served by existing vertex-centric graph processing frameworks, whose computation and execution models limit the user program to directly accessing the state of a single vertex, resulting in high execution overheads. Further, the lack of support for extracting the relevant portions of the graph that are of interest to an analysis task and loading them onto distributed memory leads to poor scalability. NSCALE allows users to write programs at the level of neighborhoods or subgraphs rather than at the level of vertices, and to declaratively specify the subgraphs of interest. It enables the efficient distributed execution of these neighborhood-centric complex analysis tasks over large-scale graphs, while minimizing resource consumption and communication cost, thereby substantially reducing the overall cost of graph data analytics in the Cloud. The results of our extensive experimental evaluation of these prototypes with several real-world data sets and applications validate the effectiveness of our techniques, which provide orders-of-magnitude reductions in the overheads of distributed data querying and analysis in the Cloud.
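The idea behind progressive analytics can be sketched generically (this is not NOW!'s progress semantics or distributed machinery, just a single-machine illustration of how an aggregate estimate and its confidence interval tighten as the sample grows, letting a caller stop early and save resources):

```python
import numpy as np

def progressive_mean(data, batch=10_000, z=1.96, seed=0):
    """Yield (n_seen, estimate, half_width) over progressively larger
    random samples, so callers can stop once the CLT-based confidence
    interval is tight enough -- the essence of progressive analytics."""
    rng = np.random.default_rng(seed)
    perm = rng.permutation(len(data))     # one random order, reused
    total, total_sq, seen = 0.0, 0.0, 0
    for start in range(0, len(data), batch):
        chunk = data[perm[start:start + batch]]
        total += chunk.sum()              # work is reused across samples
        total_sq += (chunk ** 2).sum()
        seen += len(chunk)
        mean = total / seen
        var = max(total_sq / seen - mean ** 2, 0.0)
        half = z * np.sqrt(var / seen)    # ~95% CI half-width
        yield seen, mean, half
```

Because each batch extends the previous sample rather than re-drawing it, work is reused across progressive samples, and running the generator to completion degenerates to the exact full-data answer.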


In this dissertation, different analytical strategies are developed to discover and characterize mammalian brain peptides using small amounts of tissue. The magnocellular neurons of the rat supraoptic nucleus (SON), in tissue and in cell culture, served as the main model to study neuropeptides, in addition to hippocampal neurons and mouse embryonic pituitaries. The neuropeptidomics studies described here use different extraction methods on tissue or cell culture combined with mass spectrometry (MS) techniques, matrix-assisted laser desorption/ionization (MALDI) and electrospray ionization (ESI). These strategies led to the identification of multiple peptides from the rat/mouse brain in tissue and cell cultures, including novel compounds. One of the goals of this dissertation was to optimize sample preparation for mass spectrometric analysis of samples isolated from well-defined brain regions. Here, the neuropeptidomics study of the SON resulted in the identification of 85 peptides, including 20 unique peptides from known prohormones. This study includes mass spectrometric analysis even of individually isolated magnocellular neuroendocrine cells, in which vasopressin and several other peptides were detected. At the same time, it was shown that the same approach could be applied to analyze peptides isolated from a similar hypothalamic region, the suprachiasmatic nucleus (SCN). Although there was some overlap in the peptides detected in the two brain nuclei, different peptides were detected specific to each nucleus. Among other peptides, provasopressin fragments were specifically detected in the SON, while angiotensin I, somatostatin-14, neurokinin B, galanin, and vasoactive intestinal peptide (VIP) were detected only in the SCN. Lists of peptides were generated from both brain regions for comparison of the peptidomes of the SON and SCN nuclei.
Moving from the analysis of magnocellular neurons in tissue to cell culture, direct peptidomics of the magnocellular and hippocampal neurons led to the detection of 10 peaks that were assigned to previously characterized peptides and 17 peaks that remain unassigned. Peptides from the vasopressin prohormone and secretogranin-2 are attributed to magnocellular neurons, whereas neurokinin A, peptide J, and neurokinin B are attributed to cultured hippocampal neurons. This approach enabled the elucidation of cell-specific prohormone processing and the discovery of cell-cell signaling peptides. Peptides with roles in the development of the pituitary were analyzed using transgenic mice. The Hes1 KO is a genetically modified mouse that survives only to embryonic day 18.5 (e18.5). Anterior pituitaries of Hes1-null mice exhibit hypoplasia due to increased cell death and reduced proliferation, and in the intermediate lobe the cells differentiate abnormally into somatotropes instead of melanotropes. These previous findings demonstrate that Hes1 has multiple roles in pituitary development, cell differentiation, and cell fate. AVP was detected in all samples. Interestingly, somatostatin [92-100] and provasopressin [151-168] were detected in the mutant but not in the wild-type or heterozygous pituitaries, while somatostatin-14 was detected only in the heterozygous pituitary. In addition, the putative peptide corresponding to m/z 1330.2 and POMC [205-222] were detected in the mutant and heterozygous pituitaries, but not in the wild type. These results indicate that Hes1 influences the processing of different prohormones, with possible roles during development, and open new directions for further developmental studies. This research demonstrates the robust capabilities of MS, which enables the unbiased direct analysis of peptides extracted from complex biological systems and allows important questions to be addressed in order to understand cell-cell signaling in the brain.
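Peak assignment of the kind described above boils down to matching observed m/z values to candidate peptide masses within a tolerance. A minimal sketch, with hypothetical peptide names and masses (not values from this dissertation), might look like:

```python
def assign_peaks(observed_mz, candidates, tol_ppm=50.0):
    """Match observed singly-protonated peak m/z values to candidate
    peptides by monoisotopic mass within a ppm tolerance.
    candidates: dict of name -> neutral monoisotopic mass (Da)."""
    PROTON = 1.007276                        # proton mass, Da
    assignments = {}
    for mz in observed_mz:
        neutral = mz - PROTON                # [M+H]+ -> neutral mass
        best, best_err = None, tol_ppm
        for name, mass in candidates.items():
            err = abs(neutral - mass) / mass * 1e6   # error in ppm
            if err < best_err:
                best, best_err = name, err
        assignments[mz] = best               # None if nothing within tol
    return assignments
```

In practice the candidate list would be generated from known prohormone sequences and their processing products, and unassigned peaks (the `None` entries) are exactly the candidates for novel peptides.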


Theories of sparse signal representation, wherein a signal is decomposed as the sum of a small number of constituent elements, play increasing roles in both mathematical signal processing and neuroscience. This happens despite the differences between signal models in the two domains. After reviewing preliminary material on sparse signal models, I use work on compressed sensing for the electron tomography of biological structures as a target for exploring the efficacy of sparse signal reconstruction in a challenging application domain. My research in this area addresses a topic of keen interest to the biological microscopy community, and has resulted in the development of tomographic reconstruction software which is competitive with the state of the art in its field. Moving from the linear signal domain into the nonlinear dynamics of neural encoding, I explain the sparse coding hypothesis in neuroscience and its relationship with olfaction in locusts. I implement a numerical ODE model of the activity of neural populations responsible for sparse odor coding in locusts as part of a project involving offset spiking in the Kenyon cells. I also explain the validation procedures we have devised to help assess the model's similarity to the biology. The thesis concludes with the development of a new, simplified model of locust olfactory network activity, which seeks with some success to explain statistical properties of the sparse coding processes carried out in the network.
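A minimal sketch of sparse reconstruction in the compressed-sensing setting described above (a generic iterative soft-thresholding, ISTA, solver for the l1-regularized least-squares problem, not the tomographic reconstruction software developed in the thesis):

```python
import numpy as np

def ista(A, y, lam=0.1, n_iter=500):
    """Iterative soft-thresholding for min_x 0.5*||Ax - y||^2 + lam*||x||_1.
    Step size 1/L, with L an upper bound on the largest eigenvalue of
    A^T A, guarantees descent; the shrinkage step enforces sparsity."""
    L = np.linalg.norm(A, 2) ** 2            # Lipschitz constant of gradient
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - y)
        z = x - grad / L                     # gradient step on the data fit
        x = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # shrinkage
    return x
```

With far fewer measurements than unknowns, the l1 penalty selects the sparse solution consistent with the data, which is the core mechanism behind compressed-sensing tomography and the "shrinkage" operations in sparse coding models.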


We study spatially localized states of a spiking neuronal network populated by a pulse-coupled phase oscillator known as the lighthouse model. We show that, in the limit of slow synaptic interactions, and in the continuum limit, the dynamics reduce to those of the standard Amari model. For non-slow synaptic connections we are able to go beyond the standard firing-rate analysis of localized solutions, allowing us to explicitly construct a family of co-existing one-bump solutions and then track bump width and firing pattern as a function of system parameters. We also present an analysis of the model on a discrete lattice. We show that bump states of multiple widths can co-exist, and we uncover a mechanism for bump wandering linked to the speed of synaptic processing. Moreover, beyond a wandering transition point, we show that the bump undergoes an effective random walk with a diffusion coefficient that scales exponentially with the rate of synaptic processing and linearly with the lattice spacing.
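The effective-random-walk claim suggests a simple numerical diagnostic: estimate a diffusion coefficient from the mean-squared displacement of the bump-position trajectory. The sketch below applies this to a generic unbiased lattice walk (not the lighthouse model itself), for which D = p a²/Δt is known exactly, so the estimator can be checked:

```python
import numpy as np

def msd_diffusion(positions, dt, max_lag=100):
    """Estimate a 1D diffusion coefficient by fitting MSD(t) = 2 D t
    (a line through the origin) over short lags of the trajectory."""
    lags = np.arange(1, max_lag + 1)
    msd = np.array([np.mean((positions[l:] - positions[:-l]) ** 2)
                    for l in lags])
    t = lags * dt
    return np.sum(t * msd) / (2.0 * np.sum(t ** 2))  # LSQ slope / 2

def lattice_walk(n_steps, a=1.0, p=0.5, dt=1.0, seed=0):
    """Unbiased walk on a lattice of spacing a: each step hops left or
    right with probability p each (otherwise stays put).
    For this walk, D = p * a**2 / dt exactly."""
    rng = np.random.default_rng(seed)
    r = rng.uniform(size=n_steps)
    steps = np.where(r < p, -a, np.where(r < 2 * p, a, 0.0))
    return np.concatenate([[0.0], np.cumsum(steps)])
```

Applied to simulated bump positions, the same MSD fit would let one verify the reported exponential scaling of D with the synaptic processing rate and its linear scaling with the lattice spacing.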


This project set out to determine appropriate models for the spatial and temporal distribution and the diversity of demersal fishes in the Sea of Oman, including Trichiuridae, Nemipteridae, Haemulidae, Ariidae, Synodontidae, batoid fishes, Carangidae, Sciaenidae, Carcharhiniformes and Serranidae. The research was based on catch data from 2003 to 2013 (no survey was conducted in 2007 because a vessel was unavailable). Processing and calculations were carried out using Excel, SPSS, ArcGIS and Table Curve 3D. The highest biomass and abundance were found in strata A and C, and the 10-30 m depth layers showed the most favourable biomass conditions; in other words, biomass was higher in the eastern region of the Oman Sea than in the central and western regions. Batoid fishes and Trichiuridae had the highest biomass, and depth showed a significant correlation with biomass. Sciaenidae, Serranidae and Haemulidae showed a large decline, while Synodontidae showed a very large increase. The largest Shannon index values belonged to the central and western regions of the Oman Sea, and the highest Shannon indices were found in the 10-20 m and 50-100 m layers, respectively. Distribution maps based on biomass were produced using ArcGIS, so that, for the first time over a ten-year period, the catch stations of each economically important group of aquatic species were identified. In conclusion, depth shapes the pattern of distribution, abundance and diversity of fish with distance from the shore, so that they follow a specific pattern.
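The Shannon index used above is H' = -Σ pᵢ ln pᵢ over the proportions pᵢ of each species in the catch; a minimal sketch with hypothetical catch counts:

```python
import math

def shannon_index(counts):
    """Shannon diversity H' = -sum(p_i * ln p_i) over species
    proportions, ignoring zero counts."""
    total = sum(counts)
    return -sum((c / total) * math.log(c / total) for c in counts if c > 0)
```

H' is maximal (ln S for S species) when the catch is spread evenly across species and falls to zero when a single species dominates, which is why the deeper, more even assemblages report the higher index values.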