Abstract:
Partitional clustering algorithms, which partition the dataset into a pre-defined number of clusters, can be broadly classified into two types: algorithms which explicitly take the number of clusters as input and algorithms that take the expected size of a cluster as input. In this paper, we propose a variant of the k-means algorithm and prove that it is more efficient than standard k-means algorithms. An important contribution of this paper is the establishment of a relation between the number of clusters and the size of the clusters in a dataset through the analysis of our algorithm. We also demonstrate that the integration of this algorithm as a pre-processing step in classification algorithms reduces their running-time complexity.
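The paper's specific k-means variant is not given in this abstract; below is a minimal sketch, assuming a standard Lloyd-style k-means loop, of how an expected cluster size can be turned into a cluster count via k = ceil(n / expected_size). The wrapper kmeans_by_cluster_size and that conversion rule are illustrative assumptions, not the proposed algorithm.

```python
# Minimal k-means sketch (NumPy). Hypothetical helper names; the size-to-count
# rule k = ceil(n / expected_size) is illustrative only.
import numpy as np

def kmeans(X, k, iters=100, seed=0):
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), k, replace=False)]
    for _ in range(iters):
        # assign each point to its nearest center
        labels = np.argmin(((X[:, None] - centers) ** 2).sum(-1), axis=1)
        new_centers = np.array([X[labels == j].mean(0) if np.any(labels == j)
                                else centers[j] for j in range(k)])
        if np.allclose(new_centers, centers):   # converged
            break
        centers = new_centers
    return labels, centers

def kmeans_by_cluster_size(X, expected_size):
    # derive the number of clusters from an expected cluster size
    k = max(1, int(np.ceil(len(X) / expected_size)))
    return kmeans(X, k)
```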
Abstract:
In this paper, we present a low-complexity algorithm for detection in high-rate, non-orthogonal space-time block coded (STBC) large-multiple-input multiple-output (MIMO) systems that achieve high spectral efficiencies of the order of tens of bps/Hz. We also present a training-based iterative detection/channel estimation scheme for such large STBC MIMO systems. Our simulation results show that excellent bit error rate and nearness-to-capacity performance are achieved by the proposed multistage likelihood ascent search (M-LAS) detector in conjunction with the proposed iterative detection/channel estimation scheme at low complexities. The fact that we could show such good results for large STBCs like 16×16 and 32×32 STBCs from Cyclic Division Algebras (CDA) operating at spectral efficiencies in excess of 20 bps/Hz (even after accounting for the overheads meant for pilot-based training for channel estimation and turbo coding) establishes the effectiveness of the proposed detector and channel estimator. We decode perfect codes of large dimensions using the proposed detector. With the feasibility of such a low-complexity detection/channel estimation scheme, large-MIMO systems with tens of antennas operating at several tens of bps/Hz spectral efficiencies can become practical, enabling interesting high data rate wireless applications.
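The multistage M-LAS detector and its STBC-specific structure are not spelled out here; the following is a hedged sketch of the basic likelihood ascent search step it builds on, for BPSK symbols over a generic real model y = Hx + n. The matched-filter initialization and the one-symbol-update rule are assumptions for illustration.

```python
# One-symbol-update likelihood ascent search (LAS) sketch for BPSK detection.
import numpy as np

def las_detect(H, y, x0):
    """x0: initial hard decision, entries in {-1, +1} (e.g. np.sign(H.T @ y))."""
    x = x0.copy()
    cost = np.linalg.norm(y - H @ x) ** 2
    improved = True
    while improved:
        improved = False
        for i in range(len(x)):
            x_try = x.copy()
            x_try[i] = -x_try[i]                      # flip one BPSK symbol
            c_try = np.linalg.norm(y - H @ x_try) ** 2
            if c_try < cost:                          # keep only cost-reducing flips
                x, cost, improved = x_try, c_try, True
    return x
```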
Abstract:
The need for special education (SE) is increasing. The majority of those whose problems are due to neurodevelopmental disorders have no specific aetiology. The aim of this study was to evaluate the contribution of prenatal and perinatal factors and factors associated with growth and development to later need for full-time SE and to assess joint structural and volumetric brain alterations among subjects with unexplained, familial need for SE. A random sample of 900 subjects in full-time SE allocated into three levels of neurodevelopmental problems and 301 controls in mainstream education (ME) provided data on socioeconomic factors, pregnancy, delivery, growth, and development. Of those, 119 subjects belonging to a sibling-pair in full-time SE with unexplained aetiology and 43 controls in ME underwent brain magnetic resonance imaging (MRI). Analyses of structural brain alterations and midsagittal area and diameter measurements were made. Voxel-based morphometry (VBM) analysis provided detailed information on regional grey matter, white matter, and cerebrospinal fluid (CSF) volume differences. Father’s age ≥ 40 years, low birth weight, male sex, and lower socioeconomic status all increased the probability of SE placement. At age 1 year, a decrease of one standard deviation score in height raised the probability of SE placement by 40%, and in head circumference by 28%. In infancy, the gross motor milestones differentiated the children. From age 18 months, the fine motor milestones and those related to speech and social skills became more important. Brain MRI revealed no specific aetiology for subjects in SE. However, they more often had ≥ 3 abnormal findings on MRI (thin corpus callosum and enlarged cerebral and cerebellar CSF spaces). In VBM, subjects in full-time SE had smaller global white matter, CSF, and total brain volumes than controls. Compared with controls, subjects with intellectual disabilities had regional volume alterations (greater grey matter volumes in the anterior cingulate cortex bilaterally, smaller grey matter volume in the left thalamus and left cerebellar hemisphere, greater white matter volume in the left fronto-parietal region, and smaller white matter volumes bilaterally in the posterior limbs of the internal capsules). In conclusion, the epidemiological studies identified several factors that increased the probability of SE placement, useful as a framework for interventional studies. The global and regional brain MRI findings provide an interesting basis for future investigations of learning-related brain structures in young subjects with cognitive impairments or intellectual disabilities of unexplained, familial aetiology.
Abstract:
We propose a self-regularized pseudo-time marching scheme to solve the ill-posed, nonlinear inverse problem associated with diffuse propagation of coherent light in a tissue-like object. In particular, in the context of diffuse correlation tomography (DCT), we consider the recovery of mechanical property distributions from partial and noisy boundary measurements of light intensity autocorrelation. We prove the existence of a minimizer for the Newton algorithm after establishing the existence of weak solutions for the forward equation of light amplitude autocorrelation and its Fréchet derivative and adjoint. The asymptotic stability of the solution of the ordinary differential equation obtained through the introduction of the pseudo-time is also analyzed. We show that the asymptotic solution obtained through the pseudo-time marching converges to the optimal solution provided the Hessian of the forward equation is positive definite in a neighborhood of the optimal solution. The superior noise tolerance and regularization-insensitive nature of the pseudo-dynamic strategy are demonstrated through numerical simulations in the context of both DCT and diffuse optical tomography. (C) 2010 Optical Society of America.
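The paper's pseudo-dynamic system is not reproduced in this abstract; as a hedged sketch, a gradient-flow version of the idea, with data misfit $\Phi$, forward map $G$, boundary data $m$, and Fréchet derivative $J$ (notation assumed for illustration), reads

$$\Phi(p)=\tfrac{1}{2}\,\lVert G(p)-m\rVert^{2},\qquad \frac{dp}{d\tau}=-\nabla\Phi\big(p(\tau)\big)=-J(p)^{\top}\big(G(p)-m\big),$$

with the explicit pseudo-time march $p_{n+1}=p_{n}-\Delta\tau\,J(p_{n})^{\top}\big(G(p_{n})-m\big)$. A steady state of this flow is a stationary point of $\Phi$, and it is asymptotically stable when the Hessian of $\Phi$ is positive definite there, consistent with the convergence condition stated above.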
Abstract:
The blue emission of ethyl-hexyl substituted polyfluorene (PF2/6) films is accompanied by a low-energy green emission peak around 500 nm in an inert atmosphere. The intensity of this 500 nm peak is larger in electroluminescence (EL) than in photoluminescence (PL) measurements. Furthermore, the green emission intensity reduces dramatically in the presence of molecular oxygen. To understand this, we have modeled various nonradiative processes by time-dependent quantum many-body methods. These are (i) intersystem crossing, to study conversion of excited singlets to triplets leading to phosphorescence emission, (ii) the electron-hole recombination (e-hR) process in the presence of a paramagnetic impurity, to follow the yield of triplets in a polyene system doped with a paramagnetic metal atom, and (iii) quenching of excited triplet states in the presence of oxygen molecules, to understand the low intensity of EL emission in ambient atmosphere compared with that in a nitrogen atmosphere. We have employed the Pariser-Parr-Pople Hamiltonian to model the molecules and have invoked electron-electron repulsions beyond the zero differential overlap approximation while treating interactions between the organic molecule and the rest of the system. Our time-evolution methods show that there is a large cross section for triplet formation in the e-hR process in the presence of a paramagnetic impurity with degenerate orbitals. The triplet yield through the e-hR process far exceeds that of the intersystem crossing pathway, clearly accounting for the larger intensity of the 500 nm peak in EL compared with PL measurements. We have also modeled the triplet quenching process by a paramagnetic oxygen molecule, which shows a sizable quenching cross section, especially for systems of large size. These studies show that the most probable origin of the experimentally observed low-energy EL emission is the triplets.
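The actual Pariser-Parr-Pople calculations and yield analysis are not reproduced here; the sketch below only shows the elementary operation behind such time-dependent studies, propagating a state with the matrix exponential of a toy two-level Hamiltonian (all numbers illustrative).

```python
# Schematic unitary time evolution |psi(t+dt)> = exp(-i H dt) |psi(t)>.
import numpy as np
from scipy.linalg import expm

E1, E2, V = 0.0, 1.0, 0.1                     # toy level energies and coupling (hbar = 1)
H = np.array([[E1, V],
              [V, E2]], dtype=complex)

dt, n_steps = 0.05, 400
U = expm(-1j * H * dt)                        # single-step propagator
psi = np.array([1.0, 0.0], dtype=complex)     # start in state |1>

populations = []
for _ in range(n_steps):
    psi = U @ psi
    populations.append(np.abs(psi) ** 2)      # track population transfer into |2>
```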
Abstract:
The incidence of gastric cancer has declined rapidly in the industrialised countries over the last decades. Worldwide, however, gastric cancer is still the second most common cause of cancer death. Although surgery is currently the most effective treatment, the rapid progress in adjuvant chemotherapy and radiation therapy requires a re-evaluation of prognosis assessment. The TNM staging system of the UICC is ubiquitously used; it groups patients by decreasing survival times from stage I to stage IV based on the spread of disease, i.e. depth of tumour penetration (T), extent of spread to lymph nodes (N), and the presence or absence of distant metastases (M). This is by far the most consistent prognostic classification system today. However, even within the stage groups there are patients who follow a varying course of disease. Our knowledge of the molecular differences between tumours of the same stage and morphology has been accumulating over the years, and methods for a more accurate assessment of the phenotype of neoplasias are of value when evaluating the prognosis of individual patients with gastric cancer. In this study, the immunohistochemical expression of tumour markers involved in different phases of tumourigenesis was examined. The aim was to find new markers which could provide prognostic information in addition to what is provided by the TNM variables. A total of 337 specimens from the primary tumours of patients who underwent surgery for gastric cancer were collected, and the immunohistochemical expression of seven different biomarkers was analysed. DNA ploidy and S-phase fraction (SPF) were assessed by flow cytometry. Finally, all biomarkers and clinicopathological prognostic factors were combined and evaluated in a multivariate Cox regression model to elucidate which specific factors provide independent prognostic information. In univariate survival analysis, the following variables were significant prognostic factors: epithelial and stromal syndecan-1 expression, stromal tenascin-C expression, expression of tumour-associated trypsin inhibitor (TATI) in cancer cells, nuclear p53 expression, nuclear p21 expression, DNA ploidy, and SPF. In multivariate survival analysis adjusted for all available clinicopathological and biomolecular variables, p53 expression, p21 expression, and DNA ploidy emerged as independent prognostic biomarkers, together with penetration depth of the tumour, presence of nodal metastases, surgical cure of the cancer, and age of the patient at the time of diagnosis.
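A multivariate Cox model of the kind described can be fitted as sketched below with the lifelines package; the file name and column names (survival_months, death, p53, p21, dna_ploidy, t_stage, n_stage, age) are placeholders, not the study's actual variables.

```python
# Illustrative multivariate Cox proportional hazards fit (lifelines).
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("gastric_cohort.csv")        # hypothetical data file
covariates = ["p53", "p21", "dna_ploidy", "t_stage", "n_stage", "age"]

cph = CoxPHFitter()
cph.fit(df[["survival_months", "death"] + covariates],
        duration_col="survival_months", event_col="death")
cph.print_summary()                           # hazard ratios and p-values per covariate
```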
Abstract:
Spirometry is the most widely used lung function test in the world. It is fundamental in the diagnostic and functional evaluation of various pulmonary diseases. In the studies described in this thesis, the spirometric assessment of reversibility of bronchial obstruction, its determinants, and variation features are described in a general population sample from Helsinki, Finland. This study is a part of the FinEsS study, which is a collaborative study of clinical epidemiology of respiratory health between Finland (Fin), Estonia (Es), and Sweden (S). Asthma and chronic obstructive pulmonary disease (COPD) constitute the two major obstructive airways diseases. The prevalence of asthma has increased, with around 6% of the population in Helsinki reporting physician-diagnosed asthma. The main cause of COPD is smoking, and changes in the population's smoking habits affect its prevalence with a delay. Whereas airway obstruction in asthma is by definition reversible, COPD is characterized by fixed obstruction. Cough and sputum production, the first symptoms of COPD, are often misinterpreted as smoker's cough and not recognized as the first signs of a chronic illness. Therefore, COPD is widely underdiagnosed. More extensive use of spirometry in primary care is advocated to focus smoking cessation interventions on populations at risk. The use of forced expiratory volume in six seconds (FEV6) instead of forced vital capacity (FVC) has been suggested to enable office spirometry to be used in earlier detection of airflow limitation. Although spirometry is a widely accepted standard method for assessing lung function, its methodology and interpretation are constantly developing. In 2005, the ATS/ERS Task Force issued a joint statement which endorsed the 12% and 200 ml thresholds for significant change in forced expiratory volume in one second (FEV1) or FVC during bronchodilation testing, but noted that in cases where only FVC improves, it should be verified that this is not caused by a longer exhalation time in post-bronchodilator spirometry. This elicited new interest in the assessment of forced expiratory time (FET), a spirometric variable not usually reported or used in assessment. In this population sample, we examined FET and found it to be on average 10.7 (SD 4.3) s and to increase with ageing and airflow limitation in spirometry. The intrasession repeatability of FET was the poorest of the spirometric variables assessed. Based on the intrasession repeatability, a limit of 3 s was suggested for significant change in FET during bronchodilation testing. FEV6 was found to perform as well as FVC in the population and in a subgroup of subjects with airways obstruction. In the bronchodilation test, decreases were frequently observed in FEV1 and particularly in FVC. The limit of significant increase based on the 95th percentile of the population sample was 9% for FEV1 and 6% for FEV6 and FVC; these are slightly lower than the current limits for single bronchodilation tests (ATS/ERS guidelines). FEV6 proved to be a valid alternative to FVC also in the bronchodilation test and would remove the need to control the duration of exhalation during the spirometric bronchodilation test.
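The 12% and 200 ml thresholds quoted above translate directly into a simple check of bronchodilator-response significance; the sketch below applies the ATS/ERS 2005 criterion (the 9%/6% population-based limits found in this study would simply replace the relative limit).

```python
# Bronchodilator response significance per the ATS/ERS 2005 criterion.
def significant_response(pre_ml, post_ml, rel_limit=0.12, abs_limit_ml=200):
    """True if the post-bronchodilator change meets both thresholds."""
    change_ml = post_ml - pre_ml
    return change_ml >= abs_limit_ml and change_ml / pre_ml >= rel_limit

# Example: FEV1 rises from 2500 ml to 2850 ml (+350 ml, +14%) -> significant.
print(significant_response(2500, 2850))       # True
```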
Abstract:
A reduced 3D continuum model of dynamic piezoelectricity in a thin film surface-bonded to the substrate/host is presented in this article. With large-area flexible thin piezoelectric films being employed for novel applications in devices/diagnostics, the feasibility of the proposed model in sensing surface and/or sub-surface defects is demonstrated through simulations involving metallic beams with cracks and a composite beam with delaminations of various sizes. We introduce a set of electrical measures to capture the severity of damage in existing structures. Characteristics of these electrical measures in terms of the potential difference and its spatial gradients are illustrated in the time domain. Sensitivity studies of the proposed measures in terms of the defect areas and their region of occurrence relative to the sensing film are reported. The simulation results for the electrical measures for damaged hosts/substrates are compared with those for undamaged hosts/substrates, and show monotonicity with a high degree of sensitivity to variations in the damage parameters.
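The electrical measures themselves are defined in the paper; purely as an illustration of the kind of quantity involved, the sketch below computes a potential difference across a sensing film and the peak spatial gradient from a sampled surface potential phi(x, t). Array shapes and sampling are assumptions, not the paper's model.

```python
# Illustrative electrical measures from a sampled surface potential phi(t, x).
import numpy as np

def electrical_measures(phi, dx):
    """phi: array of shape (n_time, n_x) of surface-potential samples."""
    potential_difference = phi[:, -1] - phi[:, 0]          # across the film, per time step
    spatial_gradient = np.gradient(phi, dx, axis=1)        # d(phi)/dx field
    peak_gradient = np.abs(spatial_gradient).max(axis=1)   # scalar, damage-sensitive
    return potential_difference, peak_gradient
```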
Abstract:
Information professionals and information organisations use Twitter in a variety of ways. Typically, both organisations and the individuals who work for them have separate identities on Twitter, but individuals often identify their organisation through their profile or Twitter content. This paper considers the way information professionals use Twitter and their practices with regard to privacy, personal disclosure, and identifying their organisational affiliations. Drawing on data from a research study involving a questionnaire and social media observation, the paper will provoke discussion about information professionals’ use of Twitter, personal and organisational identity, and the value of Twitter for professional development. In keeping with the subject matter, a curated set of social media content will be available in lieu of a formal paper.
Abstract:
PURPOSE To study the utility of fractional calculus in modeling gradient-recalled echo MRI signal decay in the normal human brain. METHODS We analytically solved the extended time-fractional Bloch equations, resulting in five model parameters, namely the amplitude, relaxation rate, order of the time-fractional derivative, frequency shift, and constant offset. Voxel-level temporal fitting of the MRI signal was performed using the classical monoexponential model, a previously developed anomalous relaxation model, and our extended time-fractional relaxation model. Nine brain regions segmented from multiple-echo gradient-recalled echo 7 Tesla MRI data acquired from five participants were then used to investigate the characteristics of the extended time-fractional model parameters. RESULTS We found that the extended time-fractional model fits the experimental data with smaller mean squared error than the classical monoexponential relaxation model and the anomalous relaxation model, which do not account for frequency shift. CONCLUSIONS We were able to fit multiple-echo-time MRI data with high accuracy using the developed model. Parameters of the model likely capture information on microstructural and susceptibility-induced changes in the human brain.
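The exact five-parameter form of the extended time-fractional model is not given in this abstract; as a hedged sketch, a common anomalous-relaxation parameterization S(t) = A·E_alpha(-(R t)^alpha) + c (Mittag-Leffler decay, here without the frequency-shift term) can be fitted to multi-echo magnitudes as below. The truncated series is only accurate for moderate arguments, and all numbers are synthetic.

```python
# Fitting a Mittag-Leffler (anomalous) relaxation model to echo-time data.
import numpy as np
from scipy.optimize import curve_fit
from scipy.special import gamma

def mittag_leffler(z, alpha, n_terms=60):
    k = np.arange(n_terms)
    return np.sum(np.power.outer(z, k) / gamma(alpha * k + 1), axis=-1)

def model(t, A, R, alpha, c):
    return A * mittag_leffler(-(R * t) ** alpha, alpha) + c

te = np.linspace(0.003, 0.03, 10)                    # echo times (s), placeholders
sig = model(te, 1.0, 40.0, 0.9, 0.02)                # synthetic "measured" magnitudes
popt, _ = curve_fit(model, te, sig, p0=[1.0, 30.0, 1.0, 0.0],
                    bounds=([0, 0, 0.3, -1], [10, 500, 1.2, 1]))
print(popt)                                          # amplitude, rate, order, offset
```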
Abstract:
"Extended Clifford algebras" are introduced as a means to obtain low ML decoding complexity space-time block codes. Using left regular matrix representations of two specific classes of extended Clifford algebras, two systematic algebraic constructions of full diversity Distributed Space-Time Codes (DSTCs) are provided for any power of two number of relays. The left regular matrix representation has been shown to naturally result in space-time codes meeting the additional constraints required for DSTCs. The DSTCs so constructed have the salient feature of reduced Maximum Likelihood (ML) decoding complexity. In particular, the ML decoding of these codes can be performed by applying the lattice decoder algorithm on a lattice of four times lesser dimension than what is required in general. Moreover these codes have a uniform distribution of power among the relays and in time, thus leading to a low Peak to Average Power Ratio at the relays.
Abstract:
H-1 and F-19 spin-lattice relaxation times in polycrystalline diammonium hexafluorozirconate have been measured in the temperature range of 10-400 K to elucidate the molecular motion of both the cation and the anion. Interesting features such as translational diffusion at higher temperatures, molecular reorientational motion of both cation and anion groups at intermediate temperatures, and quantum rotational tunneling of the ammonium group at lower temperatures have been observed. Nuclear magnetic resonance (NMR) relaxation time results correlate well with the NMR second moment and conductivity studies reported earlier.
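The fitting model used for these relaxation data is not given in this abstract; the standard BPP-type expression commonly used to analyse such $T_1$ minima, with an Arrhenius correlation time, is (as an assumed illustration)

$$\frac{1}{T_1}=C\left[\frac{\tau_c}{1+\omega_0^{2}\tau_c^{2}}+\frac{4\tau_c}{1+4\,\omega_0^{2}\tau_c^{2}}\right],\qquad \tau_c=\tau_0\,e^{E_a/k_BT},$$

where $\omega_0$ is the Larmor frequency; the $T_1$ minimum occurs near $\omega_0\tau_c\approx 0.62$, and the low-temperature deviation from this behaviour signals the quantum rotational tunnelling regime.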
Abstract:
The BeiDou system is the first global navigation satellite system in which all satellites transmit triple-frequency signals, which can provide positioning, navigation, and timing independently. A benefit of triple-frequency signals is that more useful combinations can be formed, including some extrawide-lane combinations whose ambiguities can generally be fixed instantaneously without distance restriction, although narrow-lane ambiguity resolution (NL AR) still depends on the interreceiver distance or requires a long time to achieve. In this paper, we synthetically study decimeter and centimeter kinematic positioning using BeiDou triple-frequency signals. It starts with AR of two extrawide-lane signals based on the ionosphere-free or ionosphere-reduced geometry-free model. For decimeter positioning, one can immediately use two ambiguity-fixed extrawide-lane observations without pursuing NL AR. To achieve higher accuracy, NL AR is the necessary next step. Although long-baseline NL AR is still challenging, some NL ambiguities can indeed be fixed with high reliability. Partial AR for NL signals is acceptable, because as long as some ambiguities for NL signals are fixed, positioning accuracy will certainly be improved. With accumulation of observations, more and more NL ambiguities are fixed and the positioning accuracy continues to improve. An efficient Kalman-filtering system is established to implement the whole process. The formulated system is flexible, since additional constraints can be easily applied to enhance the model's strength. Numerical results from a set of real triple-frequency BeiDou data on a 50 km baseline show that decimeter positioning is achievable instantaneously. With only five data epochs, 84% of NL ambiguities can be fixed, so that the real-time kinematic accuracies are 4.5, 2.5, and 16 cm for the north, east, and height components, respectively, while with 10 data epochs more than 90% of NL ambiguities are fixed and the real-time kinematic solutions improve to centimeter level for all three coordinate components.
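The reason the extrawide-lane ambiguities are so easy to fix is their long effective wavelength; the sketch below computes the wavelengths of dual-frequency BeiDou carrier-phase combinations from the published B1I/B2I/B3I frequencies (the paper's exact combination coefficients are not reproduced).

```python
# Wavelengths of BeiDou B1I/B2I/B3I carrier-phase combinations.
C = 299_792_458.0                                        # speed of light, m/s
f = {"B1I": 1561.098e6, "B2I": 1207.140e6, "B3I": 1268.520e6}   # carrier frequencies, Hz

def combo_wavelength(f_a, f_b):
    """Wavelength of the (phi_a - phi_b) difference combination."""
    return C / (f_a - f_b)

print("extrawide-lane B3I-B2I:", round(combo_wavelength(f["B3I"], f["B2I"]), 3), "m")  # ~4.884 m
print("wide-lane      B1I-B2I:", round(combo_wavelength(f["B1I"], f["B2I"]), 3), "m")  # ~0.847 m
print("narrow-lane    B1I+B2I:", round(C / (f["B1I"] + f["B2I"]), 3), "m")             # ~0.108 m
```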
Abstract:
This paper addresses the problem of detecting and resolving conflicts due to timing constraints imposed by features in real-time systems. We consider systems composed of a base system with multiple features or controllers, each of which independently advises the system on how to react to input events so as to conform to its individual specification. We propose a methodology for developing such systems in a modular manner, based on the notion of conflict-tolerant features that are designed to continue offering advice even when their advice has been overridden in the past. We give a simple priority-based scheme for composing such features. This guarantees the maximal use of each feature. We provide a formal framework for specifying such features, and a compositional technique for verifying systems developed in this framework.
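The formal framework is not reproduced in this abstract; the sketch below is only a toy illustration of the priority-based composition idea, where each feature may offer advice for an event, the highest-priority advice wins, and overridden features still observe the outcome (the conflict-tolerant aspect). All names and interfaces are assumptions.

```python
# Toy priority-based composition of conflict-tolerant features.
from typing import List, Optional

class Feature:
    def __init__(self, name: str):
        self.name = name

    def advise(self, event: str) -> Optional[str]:
        """Return advice for the event, or None if the feature has none."""
        raise NotImplementedError

    def notify(self, event: str, chosen: Optional[str]) -> None:
        """Conflict tolerance: observe what was actually done, even if overridden."""
        pass

def compose(features: List[Feature], event: str) -> Optional[str]:
    advices = [(f, f.advise(event)) for f in features]        # features listed by priority
    chosen = next((a for _, a in advices if a is not None), None)
    for f, _ in advices:
        f.notify(event, chosen)                               # every feature sees the outcome
    return chosen
```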
Abstract:
Design criteria and full-diversity Distributed Space-Time Codes (DSTCs) are reported for the two-phase transmission-based cooperative diversity protocol of Jing-Hassibi and the Generalized Nonorthogonal Amplify and Forward (GNAF) protocol, when the relay nodes are assumed to have knowledge of the phase component of the source-to-relay channel gains. It is shown that under this partial channel state information (CSI), several well-known space-time codes for the colocated MIMO (Multiple Input Multiple Output) channel become amenable for use as DSTCs. In particular, the well-known complex orthogonal designs, generalized coordinate interleaved orthogonal designs (GCIODs), and unitary weight single symbol decodable (UW-SSD) codes are shown to satisfy the required design constraints for DSTCs. Exploiting the relaxed code design constraints, we propose DSTCs obtained from Clifford Algebras which have low ML decoding complexity.