990 results for "mathematical sublime series"


Relevance: 20.00%

Publisher:

Abstract:

In this PhD study, mathematical modelling and optimisation of granola production has been carried out. Granola is an aggregated food product used in breakfast cereals and cereal bars. It is a baked crispy food product typically incorporating oats, other cereals and nuts bound together with a binder, such as honey, water and oil, to form a structured unit aggregate. In this work, the design and operation of two parallel processes to produce aggregate granola products were investigated: i) a high shear mixing granulation stage (in a designated granulator) followed by drying/toasting in an oven; and ii) a continuous fluidised bed followed by drying/toasting in an oven. In addition, the particle breakage during pneumatic conveying of granola produced by both the high shear granulator (HSG) and the fluidised bed granulator (FBG) processes was examined. Products were pneumatically conveyed in a purpose-built conveying rig designed to mimic product conveying and packaging. Three different conveying rig configurations were employed: a straight pipe, a rig with two 45° bends, and a rig with one 90° bend. The least amount of breakage occurred in the straight pipe, while the most breakage occurred in the 90° bend pipe; the two 45° bend configuration produced lower levels of breakage than the 90° bend configuration. In general, increasing the impact angle increases the degree of breakage. Additionally, for the granules produced in the HSG, those produced at 300 rpm had the lowest breakage rates while those produced at 150 rpm had the highest, which clearly demonstrates the importance of shear history (during granule production) on breakage rates during subsequent processing. For the FBG, no single operating parameter was deemed to have a significant effect on breakage during subsequent conveying. A population balance model was developed to analyse the particle breakage occurring during pneumatic conveying. The population balance equations (PBEs) governing this breakage process were solved by discretization, using the Markov chain method. This study found that increasing the air velocity (by increasing the air pressure to the rig) results in increased breakage among granola aggregates. Furthermore, the analysis carried out in this work shows that the degree of breakage of granola aggregates increases with bend angle.
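As an illustration of this modelling approach, the sketch below sets up a discretized breakage population balance and integrates it numerically. It is a minimal toy under stated assumptions, not the thesis's model: the size grid, the size-proportional selection rate, the uniform daughter distribution and all parameter values are invented, and a generic ODE solver stands in for the Markov chain solution.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Toy discretized breakage population balance:
#   dN_i/dt = -S_i N_i + sum_{j > i} b_{i,j} S_j N_j
# S_j is the breakage (selection) rate of size class j, assumed proportional
# to size; b_{i,j} is the fraction of fragments from class j landing in
# class i, assumed uniform over the smaller classes.
n_classes = 20
sizes = np.geomspace(0.1, 10.0, n_classes)   # granule size grid, mm (assumed)
S = 0.05 * sizes                             # assumed selection rates, 1/s

b = np.zeros((n_classes, n_classes))
for j in range(1, n_classes):
    b[:j, j] = 1.0 / j                       # fragments spread evenly over smaller classes

def breakage_rhs(t, N):
    birth = b @ (S * N)                      # fragments arriving from larger classes
    death = S * N                            # granules broken out of their own class
    return birth - death

N0 = np.zeros(n_classes)
N0[-1] = 1e4                                 # all granules start in the largest class
sol = solve_ivp(breakage_rhs, (0.0, 60.0), N0, t_eval=[0.0, 30.0, 60.0])
print(sol.y[:, -1])                          # size distribution after 60 s of conveying
```

Harsher conveying conditions (a sharper bend, a higher air velocity) would correspond to larger selection rates S, shifting the distribution toward the fine classes more quickly.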

Relevance: 20.00%

Publisher:

Abstract:

It is estimated that the quantity of digital data being transferred, processed or stored at any one time currently stands at 4.4 zettabytes (4.4 × 2^70 bytes) and this figure is expected to have grown by a factor of 10, to 44 zettabytes, by 2020. Exploiting this data is, and will remain, a significant challenge. At present there is the capacity to store 33% of digital data in existence at any one time; by 2020 this capacity is expected to fall to 15%. These statistics suggest that, in the era of Big Data, the identification of important, exploitable data will need to be done in a timely manner. Systems for the monitoring and analysis of data, e.g. stock markets, smart grids and sensor networks, can be made up of massive numbers of individual components. These components can be geographically distributed yet may interact with one another via continuous data streams, which in turn may affect the state of the sender or receiver. This introduces a dynamic causality, which further complicates the overall system by introducing a temporal constraint that is difficult to accommodate. Practical approaches to realising such systems have led to a multiplicity of analysis techniques, each of which concentrates on specific characteristics of the system being analysed and treats these characteristics as the dominant component affecting the results being sought. This multiplicity of analysis techniques introduces another layer of heterogeneity, that is, heterogeneity of approach, partitioning the field to the extent that results from one domain are difficult to exploit in another. The question asked here is: can a generic solution be identified for the monitoring and analysis of data that accommodates temporal constraints, bridges the gap between expert knowledge and raw data, and enables data to be effectively interpreted and exploited in a transparent manner? The approach proposed in this dissertation acquires, analyses and processes data in a manner that is free of the constraints of any particular analysis technique, while at the same time facilitating these techniques where appropriate. Constraints are applied by defining a workflow based on the production, interpretation and consumption of data. This supports the application of different analysis techniques to the same raw data without the danger of incorporating hidden bias. To illustrate and to realise this approach, a software platform has been created that allows for the transparent analysis of data, combining analysis techniques with a maintainable record of provenance so that independent third-party analysis can be applied to verify any derived conclusions. To demonstrate these concepts, a complex real-world example involving the near real-time capturing and analysis of neurophysiological data from a neonatal intensive care unit (NICU) was chosen. A system was engineered to gather raw data, analyse that data using different analysis techniques, uncover information, incorporate that information into the system and curate the evolution of the discovered knowledge. The application domain was chosen for three reasons: firstly, because it is complex and no comprehensive solution exists; secondly, because it requires tight interaction with domain experts, thus requiring the handling of subjective knowledge and inference; and thirdly, because, given the dearth of neurophysiologists, there is a real-world need to provide a solution for this domain.
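A minimal sketch of the workflow idea, in which every production, interpretation and consumption of data leaves an auditable provenance record, is given below. The function and field names are hypothetical and do not come from the dissertation's platform.

```python
import hashlib
import json
import time

# Each analysis step consumes data, produces a result, and appends a
# provenance record so that an independent third party can audit the
# derivation of any conclusion.
provenance = []

def run_step(name, func, data):
    """Apply one analysis step and record what was done, to what, and when."""
    result = func(data)
    provenance.append({
        "step": name,
        "input_digest": hashlib.sha256(json.dumps(data).encode()).hexdigest(),
        "output_digest": hashlib.sha256(json.dumps(result).encode()).hexdigest(),
        "timestamp": time.time(),
    })
    return result

# Two independent analysis techniques applied to the same raw data; the raw
# record itself is never modified, so neither technique biases the other.
raw = [3.1, 2.7, 5.9, 4.2]
mean = run_step("mean", lambda xs: sum(xs) / len(xs), raw)
peak = run_step("max", lambda xs: max(xs), raw)
print(mean, peak)
print(json.dumps(provenance, indent=2))
```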

Relevance: 20.00%

Publisher:

Abstract:

Cultural Marxist Theory, commonly known as theory, enjoyed a moment of extraordinary success in the 1970s, when the works of leading post-war French philosophers were published in English. After relocating to Anglophone academia, however, theory disavowed its original concerns and lost its ambition to understand the world as a whole, becoming the play of heterogeneities associated with postcolonialism, multiculturalism and identity politics, commonly referred to as postmodern theory. This turn, which took place during a period that seemed to have spelt the death of Marxism, the 1990s, induced many of its supporters to engage in an ongoing funeral wake, recounting the merits of theory and dreaming of its resurgence. According to them, even had theory been revived in historical circumstances completely different from those which had led to its rise, it could never have reacquired the significance that had originally connoted it. This thesis demonstrates how theory has survived its demise and entirely regained its prominence in our socio-political context, marked as it is by the effects of the latest crisis of capitalism and by the global threat of terrorisms rooted in messianic eschatologies. In its current form, theory no longer needs to show allegiance to certain intellectual stances or political groupings in order to produce important reformulations of the projects it once gave life to. Though less overtly radical and epistemologically bounded, theory remains a necessary form of enquiry, justified by the political commitment which originated it in the first place. Its voice continues to speak to us about justice ‘where it is not yet, not yet there, where it is no longer’ (Derrida, 1993, XVIII).

Relevance: 20.00%

Publisher:

Abstract:

This paper confirms the presence of GARCH(1,1) effects in the stock return time series of Vietnam’s newborn stock market. We performed tests on several time series: market returns (VN-Index) and the return series of the first four individual stocks listed on the Vietnamese exchange (the Ho Chi Minh City Securities Trading Center) since August 2000. The results are broadly consistent with empirical studies previously reported for other markets.
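For readers unfamiliar with the method, the snippet below shows a typical GARCH(1,1) fit using the third-party arch package. The return series here is synthetic; it stands in for, and makes no claim about, the VN-Index data used in the paper.

```python
import numpy as np
from arch import arch_model   # third-party package: pip install arch

# Illustrative GARCH(1,1) fit on a placeholder daily return series.
rng = np.random.default_rng(0)
returns = rng.standard_normal(1000)   # synthetic returns, not market data

am = arch_model(returns, mean="Constant", vol="GARCH", p=1, q=1)
res = am.fit(disp="off")
print(res.summary())   # omega, alpha[1], beta[1] are the GARCH(1,1) parameters
```

A significant alpha[1] with alpha[1] + beta[1] close to one is the usual signature of the persistent volatility clustering such studies report.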

Relevance: 20.00%

Publisher:

Abstract:

Continuing our development of a mathematical theory of stochastic microlensing, we study the random shear and expected number of random lensed images of different types. In particular, we characterize the first three leading terms in the asymptotic expression of the joint probability density function (pdf) of the random shear tensor due to point masses in the limit of an infinite number of stars. Up to this order, the pdf depends on the magnitude of the shear tensor, the optical depth, and the mean number of stars through a combination of radial position and the star's mass. As a consequence, the pdfs of the shear components are seen to converge, in the limit of an infinite number of stars, to shifted Cauchy distributions, which shows that the shear components have heavy tails in that limit. The asymptotic pdf of the shear magnitude in the limit of an infinite number of stars is also presented. All the results on the random microlensing shear are given for a general point in the lens plane. Extending to general random distributions (not necessarily uniform) of the lenses, we employ the Kac-Rice formula and Morse theory to deduce general formulas for the expected total number of images and the expected number of saddle images. We further generalize these results by considering random sources defined on a countable compact covering of the light source plane. This is done to introduce the notion of a global expected number of positive parity images due to a general lensing map. Applying the result to microlensing, we calculate the asymptotic global expected number of minimum images in the limit of an infinite number of stars, where the stars are uniformly distributed. This global expectation is bounded, while the global expected number of images and the global expected number of saddle images diverge at a rate of the order of the number of stars. © 2009 American Institute of Physics.
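The heavy-tailed limit can be glimpsed with a crude Monte Carlo experiment: place unit-mass stars uniformly in a disk, evaluate one shear component at the origin, and inspect the extreme quantiles of the resulting sample. This is a sketch under simplifying assumptions (unit masses, an unnormalized shear kernel), not the paper's asymptotic derivation.

```python
import numpy as np

# Sample one shear component at the origin for N unit-mass point lenses
# uniformly distributed in a disk of radius R.
rng = np.random.default_rng(1)
N, R, trials = 1000, 100.0, 10000

def gamma1_sample():
    r = R * np.sqrt(rng.random(N))            # radii giving a uniform disk
    theta = 2.0 * np.pi * rng.random(N)
    x, y = r * np.cos(theta), r * np.sin(theta)
    return np.sum((y**2 - x**2) / (x**2 + y**2)**2)

samples = np.array([gamma1_sample() for _ in range(trials)])
# Cauchy-like heavy tails: the extreme quantiles grow far faster than they
# would for a Gaussian of comparable interquartile range.
for q in (0.5, 0.9, 0.99, 0.999):
    print(q, np.quantile(samples, q))
```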

Relevance: 20.00%

Publisher:

Abstract:

BACKGROUND: Serotonin is a neurotransmitter that has been linked to a wide variety of behaviors including feeding and body-weight regulation, social hierarchies, aggression and suicidality, obsessive compulsive disorder, alcoholism, anxiety, and affective disorders. Full understanding of serotonergic systems in the central nervous system involves genomics, neurochemistry, electrophysiology, and behavior. Though associations have been found between functions at these different levels, in most cases the causal mechanisms are unknown. The scientific issues are daunting but important for human health because of the use of selective serotonin reuptake inhibitors and other pharmacological agents to treat disorders in the serotonergic signaling system. METHODS: We construct a mathematical model of serotonin synthesis, release, and reuptake in a single serotonergic neuron terminal. The model includes the effects of autoreceptors, the transport of tryptophan into the terminal, and the metabolism of serotonin, as well as the dependence of release on the firing rate. The model is based on real physiology determined experimentally and is compared to experimental data. RESULTS: We compare the variations in serotonin and dopamine synthesis due to meals and find that dopamine synthesis is insensitive to the availability of tyrosine but serotonin synthesis is sensitive to the availability of tryptophan. We conduct in silico experiments on the clearance of extracellular serotonin, normally and in the presence of fluoxetine, and compare to experimental data. We study the effects of various polymorphisms in the genes for the serotonin transporter and for tryptophan hydroxylase on synthesis, release, and reuptake. We find that, because of the homeostatic feedback mechanisms of the autoreceptors, the polymorphisms have smaller effects than one would expect. We compute the expected steady-state concentrations in serotonin transporter knockout mice and compare to experimental data. Finally, we study how the properties of the serotonin transporter and the autoreceptors give rise to the time courses of extracellular serotonin in various projection regions after a dose of fluoxetine. CONCLUSIONS: Serotonergic systems must respond robustly to important biological signals, while at the same time maintaining homeostasis in the face of normal biological fluctuations in inputs, expression levels, and firing rates. This is accomplished through the cooperative effect of many different homeostatic mechanisms, including special properties of the serotonin transporters and the serotonin autoreceptors. Many difficult questions remain before we can fully understand how serotonin biochemistry affects serotonin electrophysiology and vice versa, and how both are changed in the presence of selective serotonin reuptake inhibitors. Mathematical models are useful tools for investigating some of these questions.
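A toy two-state version of the feedback structure described, with a vesicular pool, an extracellular pool, autoreceptor-damped release and Michaelis-Menten reuptake, is sketched below. Every parameter value is invented for illustration; none is the paper's fitted physiology.

```python
from scipy.integrate import solve_ivp

# Toy feedback loop: synthesis fills a vesicular pool V; firing releases into
# the extracellular pool E; the serotonin transporter (SERT) clears E with
# Michaelis-Menten kinetics; autoreceptors damp release as E rises.
SYNTH, FIRE = 1.0, 0.5     # synthesis rate and firing rate (arbitrary units)
VMAX, KM = 2.0, 0.2        # assumed SERT reuptake parameters
K_AUTO = 5.0               # assumed autoreceptor sensitivity

def rhs(t, s):
    V, E = s
    release = FIRE * V / (1.0 + K_AUTO * E)   # autoreceptor-inhibited release
    reuptake = VMAX * E / (KM + E)            # SERT clearance
    return [SYNTH - release, release - reuptake]

sol = solve_ivp(rhs, (0.0, 50.0), [1.0, 0.05])
print(sol.y[:, -1])        # (V, E) near steady state
# Lowering VMAX (mimicking an SSRI) raises the steady extracellular level E,
# but the autoreceptor term partially buffers the change, which is the kind
# of homeostatic effect the model emphasises.
```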

Relevance: 20.00%

Publisher:

Abstract:

PURPOSE: To develop a mathematical model that can predict refractive changes after Descemet stripping endothelial keratoplasty (DSEK). METHODS: A mathematical formula based on the Gullstrand eye model was generated to estimate the change in refractive power of the eye after DSEK. This model was applied retrospectively to four DSEK cases to compare measured and predicted refractive changes after DSEK. RESULTS: The refractive change after DSEK is determined by calculating the difference in the power of the eye before and after DSEK surgery. The power of the eye after DSEK surgery can be calculated with modified Gullstrand eye model equations that incorporate the change in the posterior radius of curvature and the change in the distance between the principal planes of the cornea and lens after DSEK. Analysis of this model suggests that the ratio of central to peripheral graft thickness (CP ratio) and the central thickness can have a significant effect on refractive change, with smaller CP ratios and larger graft thicknesses resulting in larger hyperopic shifts. This model was applied to four patients, and the average predicted hyperopic shift in the overall power of the eye was calculated to be 0.83 D, corresponding to a mean of 93% (range, 75%-110%) of the patients' measured refractive shifts. CONCLUSIONS: This simplified DSEK mathematical model can be used as a first step for estimating the hyperopic shift after DSEK. Further studies are necessary to validate and refine this model.
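The calculation behind such a model is the thick-lens formula for total corneal power. The sketch below uses textbook Gullstrand-style refractive indices and invented pre- and post-operative geometry, so the numbers are purely illustrative and are not the paper's equations or patient data.

```python
# Thick-lens corneal power, P = P_ant + P_post - (d / n) * P_ant * P_post.
N_AIR, N_CORNEA, N_AQUEOUS = 1.000, 1.376, 1.336   # textbook indices

def corneal_power(r_ant_mm, r_post_mm, thickness_mm):
    """Total corneal power in diopters from the two surface powers."""
    p_ant = (N_CORNEA - N_AIR) / (r_ant_mm / 1000.0)        # anterior surface
    p_post = (N_AQUEOUS - N_CORNEA) / (r_post_mm / 1000.0)  # posterior surface
    d = thickness_mm / 1000.0
    return p_ant + p_post - (d / N_CORNEA) * p_ant * p_post

before = corneal_power(7.7, 6.8, 0.55)              # assumed pre-DSEK cornea
# A graft thicker centrally than peripherally (CP ratio < 1) effectively
# steepens the posterior curvature and thickens the cornea (assumed values).
after = corneal_power(7.7, 6.2, 0.70)
print(f"predicted power change: {after - before:+.2f} D")   # negative = hyperopic shift
```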

Relevance: 20.00%

Publisher:

Abstract:

The application of semantic technologies to the integration of biological data and the interoperability of bioinformatics analysis and visualization tools has been the common theme of a series of annual BioHackathons hosted in Japan for the past five years. Here we provide a review of the activities and outcomes from the BioHackathons held in 2011 in Kyoto and 2012 in Toyama. In order to efficiently implement semantic technologies in the life sciences, participants formed various sub-groups and worked on the following topics: Resource Description Framework (RDF) models for specific domains, text mining of the literature, ontology development, essential metadata for biological databases, platforms to enable efficient Semantic Web technology development and interoperability, and the development of applications for Semantic Web data. In this review, we briefly introduce the themes covered by these sub-groups. The observations made, conclusions drawn, and software development projects that emerged from these activities are discussed.
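As a flavour of the RDF modelling these sub-groups worked on, the snippet below builds a tiny graph with the third-party rdflib package. The namespace and the example record are invented for illustration.

```python
from rdflib import RDF, Graph, Literal, Namespace   # pip install rdflib

# A hypothetical life-science record expressed as machine-readable triples.
EX = Namespace("http://example.org/biodb/")

g = Graph()
gene = EX["gene/BRCA1"]
g.add((gene, RDF.type, EX.Gene))
g.add((gene, EX.symbol, Literal("BRCA1")))
g.add((gene, EX.organism, Literal("Homo sapiens")))

print(g.serialize(format="turtle"))
```

Publishing such triples with shared ontologies is what lets independently developed databases be queried together, which was the interoperability goal of the hackathons.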

Relevance: 20.00%

Publisher:

Abstract:

Motivated by recent findings in the field of consumer science, this paper evaluates the causal effect of debit cards on household consumption using population-based data from the Italian Survey on Household Income and Wealth (SHIW). Within the Rubin Causal Model, we focus on the estimand of the population average treatment effect for the treated (PATT). We consider three existing estimators, based on regression, on mixed matching and regression, and on propensity score weighting, and propose a new doubly-robust estimator. A semiparametric specification based on power series is adopted for the potential outcomes and the propensity score. Cross-validation is used to select the order of the power series. We conduct a simulation study to compare the performance of the estimators. The key assumptions, overlap and unconfoundedness, are systematically assessed and validated in the application. Our empirical results suggest statistically significant positive effects of debit cards on monthly household spending in Italy. © Institute of Mathematical Statistics, 2014.
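A compact sketch of a doubly-robust estimator of the effect on the treated is given below. Plain linear and logistic models stand in for the paper's cross-validated power-series specifications, and synthetic data stand in for the SHIW.

```python
import numpy as np
from sklearn.linear_model import LinearRegression, LogisticRegression

def patt_doubly_robust(X, y, t):
    """Doubly-robust estimate of the average treatment effect on the treated.

    Consistent if either the outcome model or the propensity model is
    correctly specified (the double-robustness property).
    """
    ps = LogisticRegression(max_iter=1000).fit(X, t).predict_proba(X)[:, 1]
    mu0 = LinearRegression().fit(X[t == 0], y[t == 0]).predict(X)
    n1 = t.sum()
    treated_term = np.sum((y - mu0)[t == 1]) / n1            # outcome-model term
    odds = ps / (1.0 - ps)
    control_term = np.sum((odds * (y - mu0))[t == 0]) / n1   # weighted correction
    return treated_term - control_term

# Synthetic check: the true effect on the treated is 0.5 by construction.
rng = np.random.default_rng(2)
X = rng.normal(size=(5000, 3))
t = (rng.random(5000) < 1.0 / (1.0 + np.exp(-X[:, 0]))).astype(int)
y = X @ np.array([1.0, -0.5, 0.2]) + 0.5 * t + rng.normal(size=5000)
print(patt_doubly_robust(X, y, t))
```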

Relevance: 20.00%

Publisher:

Abstract:

We introduce a dynamic directional model (DDM) for studying brain effective connectivity based on intracranial electrocorticographic (ECoG) time series. The DDM consists of two parts: a set of differential equations describing the neuronal activity of brain components (state equations), and observation equations linking the underlying neuronal states to observed data. When applied to functional MRI or EEG data, DDMs usually have complex formulations and thus can accommodate only a few regions, due to limitations in the spatial and/or temporal resolution of these imaging modalities. In contrast, we formulate our model in the context of ECoG data. The combined high temporal and spatial resolution of ECoG data results in a much simpler DDM, allowing investigation of complex connections between many regions. To identify functionally segregated sub-networks, a biologically economical form of brain network, we propose the Potts model for the DDM parameters. The neuronal states of brain components are represented by cubic spline bases and the parameters are estimated by minimizing a log-likelihood criterion that combines the state and observation equations. The Potts model is converted to a Potts penalty in a penalized regression approach to achieve sparsity in parameter estimation, for which a fast iterative algorithm is developed. The methods are applied to an auditory ECoG dataset.
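One concrete ingredient, representing a state trajectory with a cubic B-spline basis fitted by least squares, is sketched below on synthetic data. The Potts-penalized estimation itself requires the paper's iterative algorithm and is not reproduced here.

```python
import numpy as np
from scipy.interpolate import make_lsq_spline

# Synthetic stand-in for a neuronal state trajectory, fitted with a cubic
# B-spline basis by least squares.
t = np.linspace(0.0, 1.0, 200)
rng = np.random.default_rng(3)
obs = np.sin(6 * np.pi * t) + 0.2 * rng.normal(size=t.size)

k = 3                                        # cubic splines
interior = np.linspace(0.0, 1.0, 14)[1:-1]   # strictly interior knots (assumed grid)
knots = np.r_[[0.0] * (k + 1), interior, [1.0] * (k + 1)]
spline = make_lsq_spline(t, obs, knots, k=k) # least-squares spline coefficients
state = spline(t)                            # smooth estimate of the state
print(np.round(spline.c[:5], 3))
```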

Relevance: 20.00%

Publisher:

Abstract:

BACKGROUND: Controversies exist regarding the indications for unicompartmental knee arthroplasty (UKA). The objective of this study is to report the mid-term results and examine predictors of failure in a metal-backed unicompartmental knee arthroplasty design. METHODS: At a mean follow-up of 60 months, 80 medial unicompartmental knee arthroplasties (68 patients) were evaluated. Implant survivorship was analyzed using the Kaplan-Meier method. The Knee Society objective and functional scores and radiographic characteristics were compared before surgery and at final follow-up. A Cox proportional hazards model was used to examine the association of patient age, gender, obesity (body mass index > 30 kg/m²), diagnosis, Knee Society scores and patellar arthrosis with failure. RESULTS: There were 9 failures during follow-up. The mean Knee Society objective and functional scores were, respectively, 49 and 48 points preoperatively and 95 and 92 points postoperatively. The survival rate was 92% at 5 years and 84% at 10 years. The mean age was lower in the failure group than in the non-failure group (p < 0.01). However, none of the factors assessed was independently associated with failure based on the results from the Cox proportional hazards model. CONCLUSION: Gender, pre-operative diagnosis, preoperative objective and functional scores and patellar osteophytes were not independent predictors of failure of unicompartmental knee implants, although high body mass index trended toward significance. The findings suggest that the standard criteria for UKA may be expanded without compromising the outcomes, although caution may be warranted in patients with very high body mass index pending additional data to confirm our results. LEVEL OF EVIDENCE: IV.
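The reported survival analysis pipeline, Kaplan-Meier survivorship plus a Cox model screening candidate predictors, looks roughly like the sketch below. It uses the third-party lifelines package on synthetic data, not the study's cohort.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter, KaplanMeierFitter   # pip install lifelines

# Synthetic stand-in for the cohort: follow-up time, failure indicator, and
# two candidate predictors (age and an obesity indicator).
rng = np.random.default_rng(4)
n = 80
df = pd.DataFrame({
    "months": rng.exponential(120.0, n).clip(1.0, 130.0),
    "failed": (rng.random(n) < 0.12).astype(int),
    "age": rng.normal(63.0, 9.0, n),
    "bmi_over_30": (rng.random(n) < 0.4).astype(int),
})

km = KaplanMeierFitter().fit(df["months"], event_observed=df["failed"])
print(km.survival_function_.tail())          # implant survivorship over time

cph = CoxPHFitter().fit(df, duration_col="months", event_col="failed")
cph.print_summary()                          # hazard ratios for age and obesity
```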

Relevance: 20.00%

Publisher:

Abstract:

BACKGROUND: The development of hip adductor, tensor fascia lata, and rectus femoris muscle contractures following total hip arthroplasty is quite common, with some patients failing to improve despite treatment with a variety of non-operative modalities. The purpose of the present study was to describe the use of, and patient outcomes following, botulinum toxin injections as an adjunctive treatment for muscle tightness following total hip arthroplasty. METHODS: Ten patients (14 hips) who had hip adductor, abductor, and/or flexor muscle contractures following total hip arthroplasty and had been refractory to physical therapeutic efforts were treated with injection of botulinum toxin A. Eight limbs received injections into the adductor muscle, 8 limbs received injections into the tensor fascia lata muscle, and 2 limbs received injections into the rectus femoris muscle, followed by intensive physical therapy for 6 weeks. RESULTS: At a mean final follow-up of 20 months, all 14 hips had increased range in the affected arc of motion, with a mean improvement of 23 degrees (range, 10 to 45 degrees). Additionally, all hips had an improvement in hip scores, with a significant increase in mean score from 74 points (range, 57 to 91 points) prior to injection to 96 points (range, 93 to 98 points) at final follow-up. There were no serious treatment-related adverse events. CONCLUSION: Botulinum toxin A injections combined with intensive physical therapy may be considered as a potential treatment modality, especially in difficult cases of muscle tightness that are refractory to standard therapy.

Relevance: 20.00%

Publisher:

Abstract:

Objectives This study aims to (1) discuss rare nasopharyngeal masses originating from embryologic remnants of the clivus, and (2) review the embryology of the clivus and its importance in the diagnosis and treatment of these masses. Design and Participants This is a case series of three patients. We discuss the clinical and imaging characteristics of infrasellar craniopharyngioma, intranasal extraosseous chordoma, and canalis basilaris medianus. Results Case 1: A 16-year-old male patient with a history of craniopharyngioma resection presented with nasal obstruction. A nasopharyngeal cystic mass was noted to be communicating with a patent craniopharyngeal canal. Histology revealed adamantinomatous craniopharyngioma. Case 2: A 43-year-old male patient presented with nasal obstruction and headache. Computed tomography (CT) and magnetic resonance imaging revealed an enhancing polypoid mass in the posterior nasal cavity abutting the clivus. Histopathology revealed chondroid chordoma. Case 3: A 4-year-old female patient presented with a recurrent nasopharyngeal polyp. A CT cisternogram showed that this mass may have arisen from a bony defect of the middle clivus, suggestive of canalis basilaris medianus. Conclusions Understanding the embryology of the clivus is crucial when considering the differential diagnosis of a nasopharyngeal mass. Identification of characteristic findings on imaging is critical in the diagnosis and treatment of these lesions.

Relevance: 20.00%

Publisher:

Abstract:

Biogas is a mixture of methane and other gases. In its crude state, it contains carbon dioxide (CO2), which reduces its energy efficiency, and hydrogen sulfide (H2S), which is toxic and highly corrosive. Because chemical methods of removal are expensive and environmentally hazardous, this project investigated an algal-based system to remove CO2 from biogas. An anaerobic digester was used to mimic landfill biogas. Iron oxide and an alkaline spray were used to remove H2S and CO2, respectively. The CO2-laden alkali solution was added to a helical photobioreactor, where the algae metabolized the dissolved CO2 to generate algal biomass. Although technical issues prevented testing of the complete system for functionality, a cost analysis was completed and showed that the system, in its current state, is not economically feasible. However, modifications may reduce operating costs.