971 results for Homogeneous Kernels


Relevance: 10.00%

Abstract:

The ability to test large arrays of cell and biomaterial combinations in 3D environments is still rather limited in the context of tissue engineering and regenerative medicine. This limitation can generally be addressed by employing highly automated and reproducible methodologies. This study reports on the development of a highly versatile and upscalable method based on additive manufacturing for the fabrication of arrays of scaffolds, which are enclosed in individualized perfusion chambers. Devices containing eight scaffolds and their corresponding bioreactor chambers are simultaneously fabricated using a dual-extrusion additive manufacturing system. To demonstrate the versatility of the concept, the scaffolds, while enclosed in the device, are subsequently surface-coated with a biomimetic calcium phosphate layer by perfusion with simulated body fluid solution. Ninety-six scaffolds are simultaneously seeded and cultured with human osteoblasts under highly controlled bidirectional-perfusion dynamic conditions over 4 weeks. Both the coated and the noncoated scaffolds show homogeneous cell distribution and high cell viability throughout the 4-week culture period, and the CaP-coated scaffolds show a significantly increased cell number. The methodology developed in this work exemplifies the applicability of additive manufacturing as a tool for further automation of studies in the field of tissue engineering and regenerative medicine.

Relevance: 10.00%

Abstract:

Anterior cruciate ligament (ACL) tear is a common sports injury of the knee. Arthroscopic reconstruction using autogenous graft material is widely used for patients with ACL instability. The grafts most commonly used are the patellar and the hamstring tendons, fixed with various techniques. Although clinical evaluation and conventional radiography are routinely used in follow-up after ACL surgery, magnetic resonance imaging (MRI) plays an important role in the diagnosis of complications after ACL surgery. The aim of this thesis was to study the clinical outcome of patellar and hamstring tendon ACL reconstruction techniques. In addition, the postoperative appearance of the ACL graft was evaluated using several MRI sequences. Of the 175 patients who underwent an arthroscopically assisted ACL reconstruction, 99 patients were randomized into patellar tendon (n=51) or hamstring tendon (n=48) groups. In addition, 62 patients with hamstring graft ACL reconstruction were randomized into either cross-pin (n=31) or interference screw (n=31) fixation groups. Follow-up evaluation determined knee laxity, isokinetic muscle performance and several knee scores. Lateral and anteroposterior view radiographs were obtained. Several MRI sequences were obtained with a 1.5-T imager. The appearance and enhancement pattern of the graft and periligamentous tissue, and the location of the bone tunnels, were evaluated. After MRI, arthroscopy was performed on 14 symptomatic knees. The results revealed no significant differences in the 2-year outcome between the groups. In the hamstring tendon group, the average femoral and tibial bone tunnel diameters increased during the 2-year follow-up by 33% and 23%, respectively. In the asymptomatic knees, the graft showed homogeneous and low signal intensity with periligamentous streaks of intermediate signal intensity on T2-weighted MR images. In the symptomatic knees, arthroscopy revealed 12 abnormal grafts and two meniscal tears, each with an intact graft. Among 3 lax grafts visible on arthroscopy, MRI showed an intact graft and improper bone tunnel placement. For diagnosing graft failure, all MRI findings combined gave a specificity of 90% and a sensitivity of 81%. In conclusion, all techniques appeared to improve patients' performance and were therefore considered good choices for ACL reconstruction. In follow-up, MRI permits direct evaluation of the ACL graft, the bone tunnels, and additional disorders of the knee. Bone tunnel enlargement and contrast enhancement of the periligamentous tissue were non-specific MRI findings that did not signify ACL deficiency. With an intact graft and optimal femoral bone tunnel placement, graft deficiency is unlikely, and the MRI examination should be carefully scrutinized for other possible causes of the patient's symptoms.

Relevance: 10.00%

Abstract:

The present study deals with the application of cluster analysis, Fuzzy Cluster Analysis (FCA) and Kohonen Artificial Neural Networks (KANN) methods for the classification of 159 meteorological stations in India into meteorologically homogeneous groups. Eight parameters, namely latitude, longitude, elevation, average temperature, humidity, wind speed, sunshine hours and solar radiation, are used as the classification criteria for grouping. The optimal number of groups is determined to be 14 based on the Davies-Bouldin index. It is observed that the FCA approach performed better than the other two methodologies in the present study.
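
A minimal sketch of the group-count selection step, assuming synthetic placeholder data for the 159 stations and plain K-means in place of the three methods compared in the study; only the Davies-Bouldin selection logic is illustrated here.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics import davies_bouldin_score
from sklearn.preprocessing import StandardScaler

# Placeholder data: 159 stations x 8 classification criteria (latitude,
# longitude, elevation, temperature, humidity, wind speed, sunshine, radiation).
rng = np.random.default_rng(0)
X = StandardScaler().fit_transform(rng.normal(size=(159, 8)))

# Score candidate group counts with the Davies-Bouldin index (lower is better).
scores = {}
for k in range(2, 21):
    labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(X)
    scores[k] = davies_bouldin_score(X, labels)

best_k = min(scores, key=scores.get)
print("Optimal number of groups by the Davies-Bouldin index:", best_k)
```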

Relevance: 10.00%

Abstract:

In this paper, we exploit the idea of decomposition to match buyers and sellers in an electronic exchange for trading large volumes of homogeneous goods, where the buyers and sellers specify marginal-decreasing piecewise constant price curves to capture volume discounts. Such exchanges are relevant for automated trading in many e-business applications. The problem of determining winners and Vickrey prices in such exchanges is known to have a worst-case complexity equal to that of as many as (1 + m + n) NP-hard problems, where m is the number of buyers and n is the number of sellers. Our method decomposes the overall exchange problem into two separate and simpler problems: 1) a forward auction and 2) a reverse auction, which turn out to be generalized knapsack problems. In the proposed approach, we first determine the quantity of units to be traded between the sellers and the buyers using fast heuristics developed by us. Next, we solve a forward auction and a reverse auction using fully polynomial time approximation schemes available in the literature. The proposed approach has worst-case polynomial time complexity, and our experimentation shows that the approach produces good-quality solutions to the problem. Note to Practitioners: In recent times, electronic marketplaces have provided an efficient way for businesses and consumers to trade goods and services. The use of innovative mechanisms and algorithms has made it possible to improve the efficiency of electronic marketplaces by enabling optimization of revenues for the marketplace and of utilities for the buyers and sellers. In this paper, we look at single-item, multiunit electronic exchanges. These are electronic marketplaces where buyers submit bids and sellers submit asks for multiple units of a single item. We allow buyers and sellers to specify volume discounts using suitable functions. Such exchanges are relevant for high-volume business-to-business trading of standard products, such as silicon wafers, very large-scale integrated chips, desktops, telecommunications equipment, commoditized goods, etc. The problem of determining winners and prices in such exchanges is known to involve solving many NP-hard problems. Our paper exploits the familiar idea of decomposition, uses certain algorithms from the literature, and develops two fast heuristics to solve the problem in a near-optimal way in worst-case polynomial time.
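
A rough sketch of the decomposition idea under simplifying assumptions (hypothetical bid data; greedy sub-solvers instead of the generalized-knapsack FPTAS; and simple single-price seller asks, so that the greedy steps stay exact and the volume-discount complexity that makes the real problem NP-hard is deliberately left out):

```python
# Outer step: pick a total traded quantity Q heuristically; inner steps: solve
# the forward side (buyers) and the reverse side (sellers) separately.
from itertools import accumulate

# Hypothetical bids: each buyer bid is a list of (quantity, unit_price) steps
# with decreasing prices; each seller ask is a simple (capacity, unit_price).
buyer_bids = [[(10, 9.0), (10, 7.0)], [(15, 8.0), (5, 6.5)]]
seller_asks = [(12, 5.0), (20, 6.0), (8, 7.5)]

# Flatten buyers into per-unit marginal values, sellers into per-unit costs.
buyer_units = sorted((p for bid in buyer_bids for q, p in bid for _ in range(q)),
                     reverse=True)
seller_units = sorted(p for cap, p in seller_asks for _ in range(cap))

# Forward/reverse sub-problems: value of the Q best buyer units and cost of
# the Q cheapest seller units, as prefix sums.
value = [0.0] + list(accumulate(buyer_units))
cost = [0.0] + list(accumulate(seller_units))

# Heuristic outer step: choose the traded quantity Q with the largest surplus.
q_max = min(len(buyer_units), len(seller_units))
best_q = max(range(q_max + 1), key=lambda q: value[q] - cost[q])
print(f"trade {best_q} units, surplus {value[best_q] - cost[best_q]:.1f}")
```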

Relevance: 10.00%

Abstract:

Solving large-scale all-to-all comparison problems using distributed computing is increasingly significant for various applications. Previous efforts to implement distributed all-to-all comparison frameworks have treated the two phases of data distribution and comparison task scheduling separately. This leads to high storage demands as well as poor data locality for the comparison tasks, thus creating a need to redistribute the data at runtime. Furthermore, most previous methods have been developed for homogeneous computing environments, so their overall performance is degraded even further when they are used in heterogeneous distributed systems. To tackle these challenges, this paper presents a data-aware task scheduling approach for solving all-to-all comparison problems in heterogeneous distributed systems. The approach formulates the requirements for data distribution and comparison task scheduling simultaneously as a constrained optimization problem. Then, metaheuristic data pre-scheduling and dynamic task scheduling strategies are developed, along with an algorithmic implementation, to solve the problem. The approach provides perfect data locality for all comparison tasks, avoiding rearrangement of data at runtime. It achieves load balancing among heterogeneous computing nodes, thus reducing the overall computation time. It also reduces data storage requirements across the network. The effectiveness of the approach is demonstrated through experimental studies.
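
A minimal sketch of the schedule-first, data-follows idea with made-up node speeds: comparison tasks are assigned greedily by normalized load, and each node then stores exactly the items its tasks need, so no runtime redistribution is required. The paper's metaheuristic pre-scheduling additionally minimizes total storage, which this greedy version does not attempt.

```python
from itertools import combinations

node_speed = {"n0": 1.0, "n1": 2.0, "n2": 4.0}    # relative compute power
items = list(range(12))                            # data items to compare

load = {n: 0.0 for n in node_speed}
tasks = {n: [] for n in node_speed}
for pair in combinations(items, 2):                # every unordered pair once
    n = min(load, key=lambda k: load[k] / node_speed[k])  # least normalized load
    tasks[n].append(pair)
    load[n] += 1.0                                 # unit-cost comparison tasks

# Data placement follows the task assignment: perfect locality by construction.
storage = {n: {x for pair in ps for x in pair} for n, ps in tasks.items()}
for n in node_speed:
    print(n, "tasks:", len(tasks[n]), "items stored:", len(storage[n]))
```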

Relevance: 10.00%

Abstract:

Genetic susceptibility to juvenile idiopathic arthritis (JIA) was studied in the genetically homogeneous Finnish population by collecting families with two or three patients affected by this disease from cases seen in the Rheumatism Foundation Hospital. The number of families ranged in different studies from 37 to 45, and the total number of patients with JIA from whom these cases were derived was 2,000 to 2,300. Characteristics of the disease in affected siblings in Finland were compared with a population-based series and with a sibling series from the United States. A thorough clinical and ophthalmological examination was made of all affected patients belonging to the sibpair series. Information on the occurrence of chronic rheumatic diseases in parents was collected by questionnaire, and diagnoses were confirmed from hospital records. All patients, their parents and most of the healthy sibs were typed for human leukocyte antigen (HLA) alleles in loci A, C, B, DR and DQ. The HLA allele distribution of the cases was compared with corresponding data from Finnish bone marrow donors. The genetic component in JIA was found to be more significant than previously believed. A concordance rate of 25% for a disease with a population prevalence of 1 per 1000 implied a relative risk of 250 for a monozygotic (MZ) twin. The estimated risk for a sibling of an affected individual was about 15- to 20-fold. The disease was basically similar in familial and sporadic cases; the mean age at disease onset was, however, lower in familial cases (4.8 years vs. 7.4 years). Three sibpairs (3.4 expected) were concordant for the presence of asymptomatic uveitis. Uveitis would thus not appear to have any genetic component of its own, separate from the genetic basis of JIA. Four of the parents had JIA (0.2 cases expected), four had a type of rheumatoid factor-negative arthritis similar to that seen in juvenile patients but commencing in adulthood, and one had spondyloarthropathy (SPA). These findings provide additional support for the concept of a genetic predisposition to JIA and suggest the existence of a new disease entity, JIA of adult onset. Both the linkage analysis of the affected sibpairs and the association analysis of nuclear families provided overwhelming evidence of a major contribution of HLA to the genetic susceptibility to JIA. The association analysis in the Finnish population confirmed that the most significant associations prevailed for DRB1*0801 and DQB1*0402, as expected from previous observations, and indicated the independent role of Cw*0401.
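
Spelled out, the twin relative-risk figure quoted above is simply the concordance divided by the population prevalence:

```latex
\lambda_{\mathrm{MZ}} \;=\; \frac{\text{MZ concordance}}{\text{population prevalence}}
\;=\; \frac{0.25}{0.001} \;=\; 250,
\qquad
\lambda_{\mathrm{sib}} \;\approx\; 15\text{--}20 .
```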

Relevance: 10.00%

Abstract:

This study is part of the joint project "The Genetic Epidemiology and Molecular Genetics of schizophrenia in Finland" between the Departments of Mental Health and Alcohol Research, and Molecular Medicine, at the National Public Health Institute. In the study, we utilized three nationwide health care registers: 1) the Hospital Discharge Register, 2) the Free Medication Register, and 3) the Disability Pension Register, plus the National Population Register, in order to identify all patients with schizophrenia born from 1940 to 1976 (N=33,731) in Finland, and their first-degree relatives. We identified 658 patients with at least one parent born in a homogeneous isolate in northeastern Finland, as well as 4904 familial schizophrenia patients with at least two affected siblings from the whole country. The comparison group was derived from the Health 2000 Study. We collected case records and reassessed the register diagnosis. We contacted the isolate patients and a random sample of patients from the whole country to conduct diagnostic clinical interviews and to assess the negative and positive symptoms and signs of schizophrenia. In addition to these patients, we interviewed siblings who were initially healthy according to the Hospital Discharge Register. Of those with a register diagnosis of schizophrenia, schizoaffective or schizophreniform disorder, 69% received a record-based consensus diagnosis and 63% an interview-based diagnosis of schizophrenia. Patients with schizophrenia who had first-degree relatives with a psychotic disorder had more severe affective flattening and alogia than those who were the only affected individuals in their family. The novel findings were: 1) The prevalence of schizophrenia in the isolate was relatively high based on register (1.5%), case record (0.9-1.3%), and interview (0.7-1.2%) data. 2) Isolate patients, regardless of their familial loading for schizophrenia, had fewer delusions and hallucinations than the whole-country familial patients, which may be related to the genetic homogeneity in the isolate. This phenotype encourages the use of endophenotypes in genetic analyses instead of diagnoses alone. 3) The absence of a register diagnosis does not confirm that siblings are healthy, because 7.7% of siblings had psychotic symptoms already before the register diagnoses were identified in 1991. For genetic research, the register diagnosis should therefore be reassessed using either a structured interview or a best-estimate case note consensus diagnosis. Structured clinical interview methods should also be considered in clinical practice.

Relevance: 10.00%

Abstract:

We investigate the events near the fusion interfaces of dissimilar welds using a phase-field model developed for single-phase solidification of binary alloys. The parameters used here correspond to the dissimilar welding of a Ni/Cu couple. The events at the Ni and the Cu interfaces are very different, which illustrates the importance of the phase diagram through the slopes of the liquidus curves. On the Ni side, where the liquidus temperature decreases with increasing alloying, solutal melting of the base metal takes place; the resolidification, with continuously increasing solid composition, is very sluggish until the interface encounters a homogeneous melt composition. The growth difficulty of the base metal increases with increasing initial melt composition, which is equivalent to a steeper slope of the liquidus curve. On the Cu side, the initial conditions result in a deeply undercooled melt, and contributions from both constrained and unconstrained modes of growth are observed. The simulations bring out the possibility of nucleation of a concentrated solid phase from the melt, and a secondary melting of the substrate due to the associated recalescence event. The results for the Ni and Cu interfaces can be used to understand more complex dissimilar weld interfaces involving multiphase solidification.
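
For orientation, a standard single-phase binary-alloy phase-field formulation (the specific free-energy density and mobilities used in this work may differ) couples a non-conserved order parameter φ to the conserved composition c through a gradient free-energy functional:

```latex
F[\phi, c] = \int_V \Big[ f(\phi, c, T) + \tfrac{\varepsilon^{2}}{2}\,|\nabla\phi|^{2} \Big]\, dV,
\qquad
\frac{\partial \phi}{\partial t} = -M_{\phi}\,\frac{\delta F}{\delta \phi},
\qquad
\frac{\partial c}{\partial t} = \nabla \cdot \Big( M_{c}\, \nabla \frac{\delta F}{\delta c} \Big).
```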

Relevance: 10.00%

Abstract:

A polymorphic ASIC is a runtime reconfigurable hardware substrate comprising compute and communication elements. It is a "future-proof" custom hardware solution for multiple applications and their derivatives in a domain. Interoperability between application derivatives at runtime is achieved through hardware reconfiguration. In this paper we present the design of a single-cycle Network on Chip (NoC) router that is responsible for effecting runtime reconfiguration of the hardware substrate. The router design is optimized to avoid FIFO buffers at the input port and loop-back at the output crossbar. It provides virtual channels to emulate a non-blocking network and supports a simple X-Y relative addressing scheme to limit the control overhead to 9 bits per packet. The 8×8 honeycomb NoC (RECONNECT), implemented in a 130 nm UMC CMOS standard cell library, operates at 500 MHz and has a bisection bandwidth of 28.5 GBps. The network is characterized for random, self-similar and application-specific traffic patterns that model the execution of multimedia and DSP kernels with varying network loads and virtual channels. Our implementation with 4 virtual channels has an average network latency of 24 clock cycles and a throughput of 62.5% of the network capacity for random traffic. For application-specific traffic the latency is 6 clock cycles and the throughput is 87% of the network capacity.
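
A toy illustration of X-Y relative addressing (the RECONNECT router targets a honeycomb topology, and its exact 9-bit header layout is not reproduced here): the packet header carries signed offsets to the destination, and each hop exhausts the X offset before touching the Y offset, so no routing tables are needed.

```python
def route_xy(dx: int, dy: int):
    """Return (output_port, new_dx, new_dy) for one hop of X-then-Y routing."""
    if dx > 0:
        return "EAST", dx - 1, dy
    if dx < 0:
        return "WEST", dx + 1, dy
    if dy > 0:
        return "NORTH", dx, dy - 1
    if dy < 0:
        return "SOUTH", dx, dy + 1
    return "LOCAL", 0, 0          # offsets exhausted: deliver to the local port

# Example: a packet whose destination is 2 hops east and 1 hop south.
dx, dy = 2, -1
while True:
    port, dx, dy = route_xy(dx, dy)
    print(port)                   # EAST, EAST, SOUTH, LOCAL
    if port == "LOCAL":
        break
```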

Relevance: 10.00%

Abstract:

Scanning tunneling microscopy/spectroscopy studies were carried out on single crystals of colossal magnetoresistive manganite Pr0.68Pb0.32MnO3 at different temperatures in order to probe their spatial homogeneity across the metal-insulator transition temperature T_M-I (~255 K). A metallic behavior of the local conductance was observed for temperatures T < T_M-I. Zero-bias conductance (dI/dV)_(V=0), which is directly proportional to the local surface density of states at the Fermi level, shows a single distribution at temperatures T < 200 K suggesting a homogeneous electronic phase at low temperatures. In a narrow temperature window of 200 K < T < T_M-I, however, an inhomogeneous distribution of (dI/dV)_(V=0) has been observed. This result gives evidence for phase separation in the transition region in this compound.
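
The proportionality invoked above is the usual low-bias tunneling-spectroscopy relation between the zero-bias conductance and the local density of states at the Fermi level:

```latex
\left.\frac{dI}{dV}\right|_{V=0} \;\propto\; \rho_{s}(\mathbf{r}, E_{F}).
```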

Relevance: 10.00%

Abstract:

A better understanding of the limiting step in a first-order phase transition, the nucleation process, is of major importance to a variety of scientific fields ranging from atmospheric sciences to nanotechnology and even to cosmology. This is due to the fact that in most phase transitions the new phase is separated from the mother phase by a free energy barrier. This barrier is crossed in a process called nucleation. Nowadays it is considered that a significant fraction of all atmospheric particles is produced by vapor-to-liquid nucleation. In atmospheric sciences, as well as in other scientific fields, the theoretical treatment of nucleation is mostly based on a theory known as the Classical Nucleation Theory. However, the Classical Nucleation Theory is known to have only limited success in predicting the rate at which vapor-to-liquid nucleation takes place at given conditions. This thesis studies unary homogeneous vapor-to-liquid nucleation from a statistical mechanics viewpoint. We apply Monte Carlo simulations of molecular clusters to calculate the free energy barrier separating the vapor and liquid phases and compare our results against laboratory measurements and Classical Nucleation Theory predictions. According to our results, the work of adding a monomer to a cluster in equilibrium vapour is accurately described by the liquid drop model applied by the Classical Nucleation Theory, once the clusters are larger than some threshold size. The threshold cluster sizes contain only a few or some tens of molecules, depending on the interaction potential and temperature. However, the error made in modeling the smallest clusters as liquid drops results in an erroneous absolute value for the cluster work of formation throughout the size range, as predicted by the McGraw-Laaksonen scaling law. By calculating correction factors to Classical Nucleation Theory predictions for the nucleation barriers of argon and water, we show that the corrected predictions produce nucleation rates that are in good agreement with experiments. For the smallest clusters, the deviation between the simulation results and the liquid drop values is accurately modelled by the low-order virial coefficients at modest temperatures and vapour densities, or in other words, in the validity range of the non-interacting cluster theory by Frenkel, Band and Bijl. Our results do not indicate a need for a size-dependent replacement free energy correction. The results also indicate that Classical Nucleation Theory predicts the size of the critical cluster correctly. We also present a new method for the calculation of the equilibrium vapour density, the size dependence of the surface tension, and the planar surface tension directly from cluster simulations. We also show how the size dependence of the cluster surface tension at the equimolar surface is a function of virial coefficients, a result confirmed by our cluster simulations.
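
For reference, the liquid drop (capillarity) expressions used by the Classical Nucleation Theory for the work of forming an n-molecule cluster at saturation ratio S, and the resulting barrier height at the critical size, are

```latex
W(n) = -\,n\,k_{B} T \ln S \;+\; \sigma\,(36\pi)^{1/3} v_{l}^{2/3}\, n^{2/3},
\qquad
\Delta W^{*} = \max_{n} W(n) = \frac{16\pi\,\sigma^{3} v_{l}^{2}}{3\,(k_{B} T \ln S)^{2}},
```

with σ the planar surface tension and v_l the molecular volume of the liquid; the simulations described above probe where the first expression breaks down for the smallest clusters.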

Relevance: 10.00%

Abstract:

Agricultural pests are responsible for millions of dollars in crop losses and management costs every year. In order to implement optimal site-specific treatments and reduce control costs, new methods to accurately monitor and assess pest damage need to be investigated. In this paper we explore the combination of unmanned aerial vehicles (UAVs), remote sensing and machine learning techniques as a promising technology to address this challenge. The deployment of UAVs as a sensor platform is a rapidly growing field of study for biosecurity and precision agriculture applications. In this experiment, a data collection campaign is performed over a sorghum crop severely damaged by white grubs (Coleoptera: Scarabaeidae). The larvae of these scarab beetles feed on the roots of plants, which in turn impairs root exploration of the soil profile. In the field, crop health status could be classified according to three levels: bare soil where plants were decimated, transition zones of reduced plant density, and healthy canopy areas. In this study, we describe the UAV platform deployed to collect high-resolution RGB imagery as well as the image processing pipeline implemented to create an orthoimage. An unsupervised machine learning approach is formulated in order to create a meaningful partition of the image into each of the crop health levels. The aim of the approach is to simplify the image analysis step by minimizing user input requirements and avoiding the manual data labeling necessary in supervised learning approaches. The implemented algorithm is based on the K-means clustering algorithm. In order to control high-frequency components present in the feature space, a neighbourhood-oriented parameter is introduced by applying Gaussian convolution kernels prior to K-means. The outcome of this approach is a soft K-means algorithm similar to the EM algorithm for Gaussian mixture models. The results show that the algorithm delivers decision boundaries that consistently classify the field into three clusters, one for each crop health level. The methodology presented in this paper represents an avenue for further research towards automated crop damage assessments and biosecurity surveillance.
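
A minimal sketch of the smoothing-plus-clustering step, assuming a random placeholder array in place of the UAV orthoimage: a Gaussian convolution kernel suppresses high-frequency variation in the per-pixel features before K-means partitions the field into three clusters.

```python
import numpy as np
from scipy.ndimage import gaussian_filter
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)
image = rng.random((120, 160, 3))                    # stand-in RGB orthoimage

smoothed = gaussian_filter(image, sigma=(3, 3, 0))   # spatial blur, channels kept separate
features = smoothed.reshape(-1, 3)                   # one feature row per pixel

labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(features)
label_map = labels.reshape(image.shape[:2])          # per-pixel crop-health class
print(np.bincount(labels))                           # pixels assigned to each cluster
```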

Relevance: 10.00%

Abstract:

Nucleation is the first step in the formation of a new phase inside a mother phase. Two main forms of nucleation can be distinguished. In homogeneous nucleation, the new phase is formed in a uniform substance. In heterogeneous nucleation, on the other hand, the new phase emerges on a pre-existing surface (nucleation site). Nucleation is the source of about 30% of all atmospheric aerosol, which in turn has noticeable health effects and a significant impact on climate. Nucleation can be observed in the atmosphere, studied experimentally in the laboratory, and is the subject of ongoing theoretical research. This thesis attempts to be a link between experiment and theory. By comparing simulation results to experimental data, the aim is to (i) better understand the experiments and (ii) determine where the theory needs improvement. Computational fluid dynamics (CFD) tools were used to simulate homogeneous one-component nucleation of n-alcohols in argon and helium as carrier gases, homogeneous nucleation in the water-sulfuric acid system, and heterogeneous nucleation of water vapor on silver particles. In the nucleation of n-alcohols, vapor depletion, the carrier gas effect and the carrier gas pressure effect were evaluated, with a special focus on the pressure effect, whose dependence on vapor and carrier gas properties could be specified. The investigation of nucleation in the water-sulfuric acid system included a thorough analysis of the experimental setup, determining the flow conditions, vapor losses, and nucleation zone. Experimental nucleation rates were compared to various theoretical approaches. We found that none of the considered theoretical descriptions of nucleation captured the role of water in the process at all relative humidities. Heterogeneous nucleation was studied in the activation of silver particles in a TSI 3785 particle counter, which uses water as its working fluid. The role of the contact angle was investigated, and the influence of incoming particle concentrations and homogeneous nucleation on the counting efficiency was determined.
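
In the classical treatment of heterogeneous nucleation, the contact angle enters through a geometric factor that scales down the homogeneous barrier; in the flat-substrate limit (the Fletcher factor for a finite seed particle additionally depends on the particle radius) it reads

```latex
\Delta G^{*}_{\mathrm{het}} = f(\theta)\,\Delta G^{*}_{\mathrm{hom}},
\qquad
f(\theta) = \frac{(2+\cos\theta)(1-\cos\theta)^{2}}{4}.
```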

Relevance: 10.00%

Abstract:

Our attention is focused on designing an optimal procurement mechanism which a buyer can use for procuring multiple units of a homogeneous item based on bids submitted by autonomous, rational, and intelligent suppliers. We design elegant optimal procurement mechanisms for two different situations. In the first situation, each supplier specifies the maximum quantity that can be supplied together with a per-unit price. For this situation, we design an optimal mechanism S-OPT (Optimal with Simple bids). In the more generalized case, each supplier specifies discounts based on the volume of supply. In this case, we design an optimal mechanism VD-OPT (Optimal with Volume Discount bids). The VD-OPT mechanism uses the S-OPT mechanism as a building block. The proposed mechanisms minimize the cost to the buyer while satisfying, at the same time, (a) Bayesian incentive compatibility and (b) interim individual rationality.
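
As a point of reference only, the simple-bid setting can be written down with hypothetical data as a greedy least-cost fill. This is merely the cost-minimizing allocation baseline: the S-OPT and VD-OPT mechanisms additionally choose allocations and payments so that Bayesian incentive compatibility and interim individual rationality hold, which this sketch does not attempt.

```python
def greedy_allocation(bids, demand):
    """bids: list of (supplier, capacity, unit_price); returns {supplier: quantity}."""
    allocation = {}
    remaining = demand
    for supplier, capacity, price in sorted(bids, key=lambda b: b[2]):
        if remaining == 0:
            break
        qty = min(capacity, remaining)      # fill from the cheapest supplier first
        allocation[supplier] = qty
        remaining -= qty
    return allocation

# Hypothetical simple bids: (supplier, capacity, unit price); buyer needs 70 units.
bids = [("s1", 40, 9.5), ("s2", 25, 8.0), ("s3", 50, 10.2)]
print(greedy_allocation(bids, demand=70))   # {'s2': 25, 's1': 40, 's3': 5}
```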