903 results for Algorithm-oriented design
Abstract:
Engineered muscle constructs provide a promising perspective on the regeneration or substitution of irreversibly damaged skeletal muscle. However, the highly ordered structure of native muscle tissue necessitates special consideration during scaffold development. Multiple approaches to the design of anisotropically structured substrates with grooved micropatterns or parallel-aligned fibres have previously been undertaken. In this study we report the guidance effect of a scaffold that combines both approaches, oriented fibres and a grooved topography. By electrospinning onto a topographically structured collector, matrices of parallel-oriented poly(ε-caprolactone) fibres with an imprinted wavy topography of 90 µm periodicity were produced. Matrices of randomly oriented fibres or parallel-oriented fibres without micropatterns served as controls. As previously shown, un-patterned, parallel-oriented substrates induced myotube orientation that is parallel to fibre direction. Interestingly, pattern addition induced an orientation of myotubes at an angle of 24° (statistical median) relative to fibre orientation. Myotube length was significantly increased on aligned micropatterned substrates in comparison to that on aligned substrates without pattern (436 ± 245 µm versus 365 ± 212 µm; p < 0.05). We report an innovative, yet simple, design to produce micropatterned electrospun scaffolds that induce an unexpected myotube orientation and an increase in myotube length.
Abstract:
OBJECTIVE: The aim of this study was to compare the results of tendency-oriented perimetry (TOP) and a dynamic strategy in Octopus perimetry as screening methods in clinical practice. DESIGN: A prospective single-centre observational case series was performed. PARTICIPANTS AND METHODS: In a newly opened general ophthalmologic practice, 89 consecutive patients (171 eyes) with a clinical indication for Octopus static perimetry testing (ocular hypertension or suspicious optic nerve cupping) were examined prospectively with TOP and a dynamic strategy. The visual fields were graded by 3 masked observers as normal, borderline or abnormal, without any further clinical information. RESULTS: 83% of eyes showed the same result for both strategies. In 14% there was a small difference (with one visual field being abnormal or normal, the other being borderline). In only 2.9% of the eyes (5 cases) was there a contradictory result. In 4 of the 5 cases the dynamic visual field was abnormal and TOP was normal. 4 of these cases came back for a second examination. In all 4, the follow-up examination showed a normal second dynamic visual field. CONCLUSIONS: Octopus static perimetry using the TOP strategy is a fast, patient-friendly and very reliable screening tool for general ophthalmological practice. We found no false-negative results in our series.
Abstract:
Heterosis is widely used in breeding, but the genetic basis of this biological phenomenon has not been elucidated. We postulate that additive and dominance genetic effects as well as two-locus interactions estimated in classical QTL analyses are not sufficient for quantifying the contributions of QTL to heterosis. A general theoretical framework for determining the contributions of different types of genetic effects to heterosis was developed. Additive x additive epistatic interactions of individual loci with the entire genetic background were identified as a major component of midparent heterosis. On the basis of these findings we defined a new type of heterotic effect denoted as augmented dominance effect di* that comprises the dominance effect at each QTL minus half the sum of additive x additive interactions with all other QTL. We demonstrate that genotypic expectations of QTL effects obtained from analyses with the design III using testcrosses of recombinant inbred lines and composite-interval mapping precisely equal genotypic expectations of midparent heterosis, thus identifying genomic regions relevant for expression of heterosis. The theory for QTL mapping of multiple traits is extended to the simultaneous mapping of newly defined genetic effects to improve the power of QTL detection and distinguish between dominance and overdominance.
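In symbols (an illustrative rendering of the definition above; the notation d_i and (aa)_ij is assumed, not quoted from the paper), the augmented dominance effect of QTL i is its dominance effect minus half the sum of its additive × additive interactions with all other QTL:

```latex
% Augmented dominance effect of QTL i (illustrative notation):
% d_i is the dominance effect at QTL i, and (aa)_{ij} the additive x additive
% epistatic interaction between QTL i and QTL j.
d_i^{*} = d_i \;-\; \tfrac{1}{2} \sum_{j \neq i} (aa)_{ij}
```

Aggregated over all QTL, these augmented dominance effects account for midparent heterosis in the framework described, which is why the design III testcross analysis with composite-interval mapping recovers heterosis-relevant genomic regions.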
Abstract:
Neuromorphic computing has become an emerging field with a wide range of applications. Its challenge lies in developing a brain-inspired architecture that can emulate the human brain and work in real-time applications. In this report a flexible neural architecture is presented which consists of a 128 × 128 SRAM crossbar memory and 128 spiking neurons. For the neurons, a digital integrate-and-fire model is used. All components are designed in a 45 nm technology node. The core can be configured for certain neuron parameters, axon types and synapse states, and is fully digitally implemented. Learning for this architecture is done offline. To train the circuit, a well-known algorithm, the Restricted Boltzmann Machine (RBM), is used, and linear classifiers are trained at the output of the RBM. Finally, the circuit was tested on a handwritten digit recognition application. Future prospects for this architecture are also discussed.
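As a rough software analogue of the hardware neuron described above, a digital integrate-and-fire model accumulates synaptic input in a membrane register and emits a spike when a threshold is crossed. This is a minimal sketch; the threshold, leak, and reset behaviour here are illustrative assumptions, not the report's actual digital design:

```python
class DigitalIFNeuron:
    """Minimal digital integrate-and-fire neuron (illustrative sketch)."""

    def __init__(self, threshold, leak):
        self.threshold = threshold  # firing threshold (assumed value)
        self.leak = leak            # units leaked per timestep (assumed)
        self.potential = 0          # membrane potential register

    def step(self, synaptic_input):
        # Integrate input, apply leak, and clamp the register at zero.
        self.potential = max(0, self.potential + synaptic_input - self.leak)
        if self.potential >= self.threshold:
            self.potential = 0      # reset after firing
            return 1                # spike out
        return 0                    # no spike

# Constant drive of 3 units per step against a threshold of 10:
neuron = DigitalIFNeuron(threshold=10, leak=0)
spikes = [neuron.step(3) for _ in range(8)]  # fires every 4th step
```

In hardware, the same accumulate-compare-reset loop maps naturally onto an adder, a comparator, and a register per neuron, which is what makes the fully digital implementation compact.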
DESIGN AND IMPLEMENT DYNAMIC PROGRAMMING BASED DISCRETE POWER LEVEL SMART HOME SCHEDULING USING FPGA
Abstract:
With the development and growing capabilities of Smart Home systems, people are entering an era in which household appliances are no longer controlled only by people, but also operated by a smart system. This results in a more efficient, convenient, comfortable, and environmentally friendly living environment. A critical part of a Smart Home system is home automation, meaning that a micro-controller unit (MCU) controls all the household appliances and schedules their operating times. This reduces electricity bills by shifting power consumption from on-peak hours to off-peak hours according to the varying hourly price. In this paper, we propose an algorithm for scheduling multi-user power consumption and implement it on an FPGA board, which serves as the MCU. This scheduling algorithm for discrete power level tasks is based on dynamic programming and finds a scheduling solution close to the optimal one. We chose an FPGA as the system's controller because FPGAs have low complexity, parallel processing capability, a large number of I/O interfaces for further development, and are programmable in both software and hardware. In conclusion, the algorithm runs quickly on the FPGA board and the solution obtained is good enough for consumers.
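The dynamic-programming idea can be illustrated on a single appliance with discrete power levels: for each hour, choose a level so that the total energy demand is met at minimum cost under hourly prices. This is a simplified single-user sketch with assumed inputs; the paper's multi-user FPGA implementation is more involved:

```python
def schedule(prices, levels, demand):
    """DP over (hour, delivered energy) states: pick one discrete power
    level per hour so total energy meets `demand` at minimum cost.
    `levels` must include 0 so the appliance can idle in expensive hours."""
    horizon, INF = len(prices), float("inf")
    dp = [0] + [INF] * demand           # dp[e] = min cost to deliver e units
    choice = [[None] * (demand + 1) for _ in range(horizon)]
    for h, price in enumerate(prices):
        new = [INF] * (demand + 1)
        for e in range(demand + 1):
            if dp[e] == INF:
                continue
            for lvl in levels:
                ne = min(demand, e + lvl)          # cap at the demand target
                cost = dp[e] + lvl * price
                if cost < new[ne]:
                    new[ne] = cost
                    choice[h][ne] = (e, lvl)       # remember the transition
        dp = new
    plan, e = [], demand                # backtrack the per-hour power levels
    for h in range(horizon - 1, -1, -1):
        e, lvl = choice[h][e]
        plan.append(lvl)
    plan.reverse()
    return dp[demand], plan

# 4 hours priced 5, 1, 3, 1; power levels 0-2; 4 units of energy required.
cost, plan = schedule([5, 1, 3, 1], [0, 1, 2], 4)  # runs flat-out in cheap hours
```

The table-driven structure (two nested loops over a fixed state space) is also what makes this kind of DP a good fit for a hardware implementation.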
Abstract:
Radiation metabolomics employing mass spectral technologies represents a plausible means of high-throughput, minimally invasive radiation biodosimetry. A simplified metabolomics protocol is described that employs ubiquitous gas chromatography-mass spectrometry and open-source software, including the random forests machine-learning algorithm, to uncover latent biomarkers of 3 Gy gamma radiation in rats. Urine was collected from six male Wistar rats and six sham-irradiated controls for 7 days, 4 prior to irradiation and 3 after irradiation. Water and food consumption, urine volume, body weight, and sodium, potassium, calcium, chloride, phosphate and urea excretion showed major effects of exposure to gamma radiation. The metabolomics protocol uncovered several urinary metabolites that were significantly up-regulated (glyoxylate, threonate, thymine, uracil, p-cresol) or down-regulated (citrate, 2-oxoglutarate, adipate, pimelate, suberate, azelaate) as a result of radiation exposure. Thymine and uracil were shown to derive largely from thymidine and 2'-deoxyuridine, which are known radiation biomarkers in the mouse. The radiation metabolomic phenotype in rats appeared to derive from oxidative stress and effects on kidney function. Gas chromatography-mass spectrometry is a promising platform on which to develop the field of radiation metabolomics further and to assist in the design of instrumentation for use in detecting the biological consequences of environmental radiation release.
Abstract:
In this paper, a computer-aided diagnostic (CAD) system for the classification of hepatic lesions from computed tomography (CT) images is presented. Regions of interest (ROIs) taken from nonenhanced CT images of normal liver, hepatic cysts, hemangiomas, and hepatocellular carcinomas have been used as input to the system. The proposed system consists of two modules: the feature extraction and the classification modules. The feature extraction module calculates the average gray level and 48 texture characteristics, which are derived from the spatial gray-level co-occurrence matrices, obtained from the ROIs. The classifier module consists of three sequentially placed feed-forward neural networks (NNs). The first NN classifies into normal or pathological liver regions. The pathological liver regions are characterized by the second NN as cyst or "other disease." The third NN classifies "other disease" into hemangioma or hepatocellular carcinoma. Three feature selection techniques have been applied to each individual NN: the sequential forward selection, the sequential floating forward selection, and a genetic algorithm for feature selection. The comparative study of the above dimensionality reduction methods shows that genetic algorithms result in lower dimension feature vectors and improved classification performance.
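The genetic-algorithm feature selection can be sketched as evolving binary masks over the feature set. The population size, operators, and toy fitness below are illustrative assumptions; the paper's actual GA settings and texture features are not reproduced here:

```python
import random

def ga_feature_selection(n_features, fitness, pop_size=20, generations=30,
                         crossover_rate=0.8, mutation_rate=0.05, seed=0):
    """Plain genetic algorithm over binary feature masks (illustrative)."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n_features)]
           for _ in range(pop_size)]
    for _ in range(generations):
        scored = sorted(pop, key=fitness, reverse=True)
        next_pop = scored[:2]                        # elitism: keep 2 best
        while len(next_pop) < pop_size:
            # Tournament selection of two parents.
            p1 = max(rng.sample(pop, 3), key=fitness)
            p2 = max(rng.sample(pop, 3), key=fitness)
            if rng.random() < crossover_rate:        # single-point crossover
                cut = rng.randrange(1, n_features)
                child = p1[:cut] + p2[cut:]
            else:
                child = p1[:]
            # Bit-flip mutation.
            child = [b ^ (rng.random() < mutation_rate) for b in child]
            next_pop.append(child)
        pop = next_pop
    return max(pop, key=fitness)

# Toy fitness: reward assumed 'informative' features, penalise subset size,
# mimicking the pressure toward lower-dimension feature vectors.
informative = {0, 3, 5}
def fitness(mask):
    return sum(2 for i, b in enumerate(mask) if b and i in informative) \
           - sum(mask)

best = ga_feature_selection(8, fitness)
```

In the CAD setting, the fitness would instead be the classification performance of the neural network trained on the masked texture features, which is why the GA tends to find subsets that are both smaller and more discriminative than those from sequential selection.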
Abstract:
This paper presents the results of a comprehensive literature review of the organization of purchasing covering the period from 1967 to 2009. The review provides a structured overview of prior research topics and findings and identifies gaps in the existing literature that may be addressed in future research. The intention of the review is to a) synthesize prior research, b) provide researchers with a structural framework on which future research on the organization of purchasing may be oriented, and c) suggest promising areas for future research.
Abstract:
Continuous advancements in technology have led to increasingly comprehensive and distributed product development processes while in pursuit of improved products at reduced costs. Information associated with these products is ever changing, and structured frameworks have become integral to managing such fluid information. Ontologies and the Semantic Web have emerged as key alternatives for capturing product knowledge in both a human-readable and computable manner. The primary focus of this research is to characterize relationships formed within methodically developed distributed design knowledge frameworks to ultimately provide pervasive real-time awareness in distributed design processes. Utilizing formal logics in the form of the Semantic Web’s OWL and SWRL, causal relationships are expressed to guide and facilitate knowledge acquisition as well as identify contradictions between knowledge in a knowledge base. To improve the efficiency during both the development and operational phases of these “intelligent” frameworks, a semantic relatedness algorithm is designed specifically to identify and rank underlying relationships within product development processes. After reviewing several semantic relatedness measures, three techniques, including a novel meronomic technique, are combined to create AIERO, the Algorithm for Identifying Engineering Relationships in Ontologies. In determining its applicability and accuracy, AIERO was applied to three separate, independently developed ontologies. The results indicate AIERO is capable of consistently returning relatedness values one would intuitively expect. To assess the effectiveness of AIERO in exposing underlying causal relationships across product development platforms, a case study involving the development of an industry-inspired printed circuit board (PCB) is presented.
After instantiating the PCB knowledge base and developing an initial set of rules, FIDOE, the Framework for Intelligent Distributed Ontologies in Engineering, was employed to identify additional causal relationships through extensional relatedness measurements. In a conclusive PCB redesign, the resulting “intelligent” framework demonstrates its ability to pass values between instances, identify inconsistencies amongst instantiated knowledge, and identify conflicting values within product development frameworks. The results highlight how the introduced semantic methods can enhance the current knowledge acquisition, knowledge management, and knowledge validation capabilities of traditional knowledge bases.
Abstract:
Because the usage scenarios are unknown, designing the elementary services of a service-oriented architecture (SOA), which form the basis for later composition, is rather difficult. Various design guidelines have been proposed by academia, tool vendors and consulting companies, but they differ in the rigor of their validation and are often biased toward a particular technology. For this reason, a multiple-case study was conducted in five large organizations that successfully introduced SOA in their daily business. The observed approaches are contrasted with the findings from a literature review to derive recommendations for SOA service design.
Abstract:
In numerous intervention studies and education field trials, random assignment to treatment occurs in clusters rather than at the level of observation. This departure from random assignment of individual units may be due to logistics, political feasibility, or ecological validity. Data within the same cluster or grouping are often correlated. Application of traditional regression techniques, which assume independence between observations, to clustered data produces consistent parameter estimates. However, such estimators are often inefficient compared to methods which incorporate the clustered nature of the data into the estimation procedure (Neuhaus 1993).1 Multilevel models, also known as random effects or random components models, can be used to account for the clustering of data by estimating higher level, or group, as well as lower level, or individual, variation. Designing a study in which the unit of observation is nested within higher level groupings requires the determination of sample sizes at each level. This study investigates the effect of the design and of various sampling strategies for a 3-level repeated measures design on the parameter estimates when the outcome variable of interest follows a Poisson distribution. Results of the study suggest that second order PQL estimation produces the least biased estimates in the 3-level multilevel Poisson model, followed by first order PQL and then second and first order MQL. The MQL estimates of both fixed and random parameters are generally satisfactory when the level 2 and level 3 variation is less than 0.10. However, as the higher level error variance increases, the MQL estimates become increasingly biased. If convergence of the estimation algorithm is not obtained by the PQL procedure and the higher level error variance is large, the estimates may be significantly biased. In this case, bias correction techniques such as bootstrapping should be considered as an alternative procedure.
For larger sample sizes, structures with 20 or more units sampled at the levels with normally distributed random errors produced more stable estimates with less sampling variance than structures with an increased number of level 1 units. For small sample sizes, sampling fewer units at the level with Poisson variation produces less sampling variation; however, this criterion is no longer important when sample sizes are large.
1. Neuhaus J (1993). "Estimation Efficiency and Tests of Covariate Effects with Clustered Binary Data". Biometrics, 49, 989–996.
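The data-generating model studied above can be sketched as a 3-level random-intercept Poisson simulation. This is a minimal sketch with assumed parameter values; the study's actual simulation design and estimation (PQL/MQL) are considerably richer:

```python
import math
import random

def poisson_draw(rng, mu):
    """Knuth's multiplication method for Poisson draws (fine for small mu)."""
    limit, k, p = math.exp(-mu), 0, 1.0
    while True:
        p *= rng.random()
        if p <= limit:
            return k
        k += 1

def simulate_3level_poisson(n_l3=10, n_l2=5, n_l1=20,
                            beta0=0.5, sd_l3=0.3, sd_l2=0.3, seed=1):
    """Counts from a 3-level random-intercept Poisson model:
    log(mu) = beta0 + u_k (level-3 effect) + v_jk (level-2 effect)."""
    rng = random.Random(seed)
    data = []
    for k in range(n_l3):
        u = rng.gauss(0, sd_l3)            # level-3 random intercept
        for j in range(n_l2):
            v = rng.gauss(0, sd_l2)        # level-2 random intercept
            mu = math.exp(beta0 + u + v)
            for _ in range(n_l1):          # level-1 observations
                data.append((k, j, poisson_draw(rng, mu)))
    return data

data = simulate_3level_poisson()           # 10 x 5 x 20 = 1000 counts
```

Varying `n_l3`, `n_l2`, and `n_l1` while holding the total sample fixed is exactly the kind of sampling-strategy trade-off the study examines.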
Abstract:
BACKGROUND Although well-established for suspected lower limb deep venous thrombosis, an algorithm combining a clinical decision score, d-dimer testing, and ultrasonography has not been evaluated for suspected upper extremity deep venous thrombosis (UEDVT). OBJECTIVE To assess the safety and feasibility of a new diagnostic algorithm in patients with clinically suspected UEDVT. DESIGN Diagnostic management study (ClinicalTrials.gov: NCT01324037). SETTING 16 hospitals in Europe and the United States. PATIENTS 406 inpatients and outpatients with suspected UEDVT. MEASUREMENTS The algorithm consisted of the sequential application of a clinical decision score, d-dimer testing, and ultrasonography. Patients were first categorized as likely or unlikely to have UEDVT; in those with an unlikely score and normal d-dimer levels, UEDVT was excluded. All other patients had (repeated) compression ultrasonography. The primary outcome was the 3-month incidence of symptomatic UEDVT and pulmonary embolism in patients with a normal diagnostic work-up. RESULTS The algorithm was feasible and completed in 390 of the 406 patients (96%). In 87 patients (21%), an unlikely score combined with normal d-dimer levels excluded UEDVT. Superficial venous thrombosis and UEDVT were diagnosed in 54 (13%) and 103 (25%) patients, respectively. All 249 patients with a normal diagnostic work-up, including those with protocol violations (n = 16), were followed for 3 months. One patient developed UEDVT during follow-up, for an overall failure rate of 0.4% (95% CI, 0.0% to 2.2%). LIMITATIONS This study was not powered to show the safety of the substrategies. d-Dimer testing was done locally. CONCLUSION The combination of a clinical decision score, d-dimer testing, and ultrasonography can safely and effectively exclude UEDVT. If confirmed by other studies, this algorithm has potential as a standard approach to suspected UEDVT. PRIMARY FUNDING SOURCE None.
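The sequential logic of the diagnostic algorithm can be sketched as plain branching code (function and label names are illustrative; clinical decisions should of course follow the validated protocol, not this sketch):

```python
def uedvt_workup(uedvt_likely, d_dimer_normal, ultrasound_positive):
    """Sequential application of the three steps (labels illustrative).

    uedvt_likely:        clinical decision score categorises the patient
                         as 'likely' (True) or 'unlikely' (False)
    d_dimer_normal:      d-dimer level within the normal range
    ultrasound_positive: (repeated) compression ultrasonography finding
    """
    if not uedvt_likely and d_dimer_normal:
        return "excluded"            # unlikely score + normal d-dimer: stop
    if ultrasound_positive:          # everyone else proceeds to imaging
        return "confirmed"
    return "excluded"

outcome = uedvt_workup(uedvt_likely=False, d_dimer_normal=True,
                       ultrasound_positive=False)
```

The first branch is what spares roughly a fifth of patients (87 of 406 in the study) from ultrasonography altogether.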
Abstract:
Background: Motive-oriented therapeutic relationship (MOTR) was postulated to be a particularly helpful therapeutic ingredient in the early treatment phase of patients with personality disorders, in particular with borderline personality disorder (BPD). The present randomized controlled study, using an add-on design, is the first to test this assumption in a 10-session general psychiatric treatment of patients presenting with BPD, with respect to symptom reduction and therapeutic alliance. Methods: A total of 85 patients were randomized. They were allocated either to a manual-based short variant of the general psychiatric management (GPM) treatment (in 10 sessions) or to the same treatment with MOTR deliberately added. Treatment attrition and integrity analyses yielded satisfactory results. Results: The intent-to-treat analyses suggested a global efficacy of MOTR, in the sense of an additional reduction of general problems, i.e. symptoms, interpersonal and social problems (F(1, 73) = 7.25, p < 0.05). However, MOTR did not yield an additional reduction of specific borderline symptoms. A stronger therapeutic alliance, as assessed by the therapist, developed in MOTR treatments compared to GPM (Z(55) = 0.99, p < 0.04). Conclusions: These results suggest that adding MOTR to psychiatric and psychotherapeutic treatments of BPD is promising. Moreover, the findings shed additional light on the prospect of shortening treatments for patients presenting with BPD.
Abstract:
OBJECTIVE The aim of the present study was to evaluate dose reduction in contrast-enhanced chest computed tomography (CT) by comparing the three latest generations of Siemens CT scanners used in clinical practice. We analyzed the amount of radiation used with filtered back projection (FBP) and with an iterative reconstruction (IR) algorithm to yield the same image quality. Furthermore, the influence on radiation dose of the most recent integrated circuit detector (ICD; Stellar detector, Siemens Healthcare, Erlangen, Germany) was investigated. MATERIALS AND METHODS 136 patients were included. Scan parameters were set to a routine thorax protocol: SOMATOM Sensation 64 (FBP), SOMATOM Definition Flash (IR), and SOMATOM Definition Edge (ICD and IR). Tube current was set constantly to the reference level of 100 mA, using automated tube current modulation with reference milliamperes. CARE kV was used on the Flash and Edge scanners, while on the SOMATOM Sensation the tube potential was selected individually between 100 and 140 kVp by the medical technologists. Quality assessment was performed on soft-tissue kernel reconstructions. Dose was represented by the dose-length product (DLP). RESULTS The DLP with FBP for the average chest CT was 308 mGy*cm ± 99.6. In contrast, the DLP for chest CT with the IR algorithm was 196.8 mGy*cm ± 68.8 (P = 0.0001). A further decline in dose is noted with IR and the ICD: DLP 166.4 mGy*cm ± 54.5 (P = 0.033). The dose reduction compared to FBP was 36.1% with IR and 45.6% with IR/ICD. Signal-to-noise ratio (SNR) was favorable in the aorta, bone, and soft tissue for the IR/ICD combination compared to FBP (P values ranged from 0.003 to 0.048). Overall, contrast-to-noise ratio (CNR) improved with declining DLP. CONCLUSION The most recent technical developments, namely IR in combination with integrated circuit detectors, can significantly lower the radiation dose in chest CT examinations.
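Recomputing the dose reductions from the rounded mean DLPs is straightforward (the small difference from the reported 45.6% for IR/ICD presumably reflects rounding of the published means):

```python
def dose_reduction_pct(dlp_reference, dlp_new):
    """Percent reduction in dose-length product relative to a reference."""
    return 100.0 * (dlp_reference - dlp_new) / dlp_reference

# Mean DLPs in mGy*cm as reported in the abstract.
fbp, ir, ir_icd = 308.0, 196.8, 166.4
reduction_ir = dose_reduction_pct(fbp, ir)          # ~36.1 %, as reported
reduction_ir_icd = dose_reduction_pct(fbp, ir_icd)  # ~46.0 % from rounded means
```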