948 results for soft computing methods
Abstract:
This paper surveys research in the field of data mining, which is related to discovering the dependencies between attributes in databases. We consider a number of approaches to finding the distribution intervals of association rules, to discovering branching dependencies between a given set of attributes and a given attribute in a database relation, to finding fractional dependencies between a given set of attributes and a given attribute in a database relation, and to collaborative filtering.
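As a minimal illustration of the dependency measures such surveys build on, the sketch below computes the support and confidence of a candidate association rule over a toy transaction database. The transactions, item names, and rule are invented for the example and do not come from the paper.

```python
# Toy transaction database (illustrative values, not from the survey).
transactions = [
    {"bread", "milk"},
    {"bread", "butter"},
    {"bread", "milk", "butter"},
    {"milk"},
]

def support(itemset, db):
    """Fraction of transactions containing every item in `itemset`."""
    return sum(itemset <= t for t in db) / len(db)

def confidence(antecedent, consequent, db):
    """Estimate of P(consequent | antecedent) over the database."""
    return support(antecedent | consequent, db) / support(antecedent, db)

print(support({"bread", "milk"}, transactions))      # 0.5 (2 of 4 transactions)
print(confidence({"bread"}, {"milk"}, transactions)) # 2/3
```

Approaches that bound these quantities with distribution intervals, as the survey describes, report a range for each measure rather than the point estimates printed here.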
Abstract:
This article presents the principal results of the doctoral thesis “Direct Operational Methods in the Environment of a Computer Algebra System” by Margarita Spiridonova (Institute of Mathematics and Informatics, BAS), successfully defended before the Specialised Academic Council for Informatics and Mathematical Modelling on 23 March, 2009.
Abstract:
We present quasi-Monte Carlo analogs of Monte Carlo methods for some linear algebra problems: solving systems of linear equations, computing extreme eigenvalues, and matrix inversion. Reformulating the problems as solving integral equations with special kernels and domains permits us to analyze the quasi-Monte Carlo methods with bounds from numerical integration. Standard Monte Carlo methods for integration provide a convergence rate of O(N^(−1/2)) using N samples. Quasi-Monte Carlo methods use quasirandom sequences, with a resulting convergence rate for numerical integration as good as O((log N)^k N^(−1)). We have shown theoretically and through numerical tests that the use of quasirandom sequences improves both the magnitude of the error and the convergence rate of the considered Monte Carlo methods. We also analyze the complexity of the considered quasi-Monte Carlo algorithms and compare it to the complexity of the analogous Monte Carlo and deterministic algorithms.
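The convergence advantage can be sketched in a few lines. The snippet below integrates a smooth function with pseudorandom points and with a base-2 van der Corput sequence (a standard low-discrepancy construction, used here as an illustrative stand-in for the quasirandom sequences in the paper) and compares the absolute integration errors.

```python
import random

def van_der_corput(n, base=2):
    """First n terms of the base-b van der Corput quasirandom sequence."""
    seq = []
    for i in range(1, n + 1):
        q, denom, x = i, 1.0, 0.0
        while q:
            denom *= base
            q, r = divmod(q, base)
            x += r / denom
        seq.append(x)
    return seq

def integrate(f, points):
    """Sample-mean estimate of the integral of f over [0, 1]."""
    return sum(f(x) for x in points) / len(points)

f = lambda x: x * x  # true integral over [0, 1] is 1/3
N = 4096
random.seed(0)
mc_err = abs(integrate(f, [random.random() for _ in range(N)]) - 1 / 3)
qmc_err = abs(integrate(f, van_der_corput(N)) - 1 / 3)
# The quasirandom error is O(log N / N); the pseudorandom error
# is typically O(N^(-1/2)), so qmc_err is usually much smaller.
print(mc_err, qmc_err)
```

For this smooth integrand, the Koksma-Hlawka bound caps the quasirandom error at the function's variation times the sequence's star discrepancy, which is what delivers the near-O(N^(−1)) rate quoted above.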
Abstract:
Report published in the Proceedings of the National Conference on "Education and Research in the Information Society", Plovdiv, May, 2015
Abstract:
Purpose: To investigate how the initial fit of HEMA and silicone-hydrogel (SiHy) contact lenses on insertion, which informs prescribing decisions, reflects end-of-day fit. Methods: Thirty participants (aged 22.9 ± 4.9 years) were fitted contralaterally with HEMA and SiHy contact lenses. Corneal topography and tear break-up time were assessed pre-lens wear. Centration, lag, post-blink movement during up-gaze and push-up recovery speed were recorded after 5, 10, 20 min and 8 h of contact lens wear by a digital slit-lamp biomicroscope camera, along with reported comfort. Lens fit metrics were analysed using bespoke software. Results: Comfort and centration were similar with the HEMA and SiHy lenses (p > 0.05), but comfort decreased with time (p < 0.01) whereas centration remained stable (F = 0.036, p = 0.991). Movement-on-blink and lag were greater with the HEMA than the SiHy lens (p < 0.01), but movement-on-blink decreased with time after insertion (F = 22.423, p < 0.001) whereas lag remained stable (F = 1.967, p = 0.129). Push-up recovery speed was similar with the HEMA and the SiHy lens 5–20 min after insertion (p > 0.05), but was slower with SiHy after 8 h of wear (p = 0.016). Lens movement on blink and push-up recovery speed were predictive of the movement after 8 h of wear after 10–20 min of SiHy wear, but after 5–20 min of HEMA lens wear. Conclusions: A HEMA or SiHy contact lens with poor movement on blink/push-up at least 10 min after insertion should be rejected.
Abstract:
One of the most pressing demands on electrophysiology applied to the diagnosis of epilepsy is the non-invasive localization of the neuronal generators responsible for brain electrical and magnetic fields (the so-called inverse problem). These neuronal generators produce primary currents in the brain, which together with passive currents give rise to the EEG signal. Unfortunately, the signal we measure on the scalp surface does not directly indicate the location of the active neuronal assemblies. This is the expression of the ambiguity of the underlying static electromagnetic inverse problem, partly due to the relatively limited number of independent measures available. A given electric potential distribution recorded at the scalp can be explained by the activity of infinitely many different configurations of intracranial sources. In contrast, the forward problem, which consists of computing the potential field at the scalp from known source locations and strengths with known geometry and conductivity properties of the brain and its layers (CSF/meninges, skin and skull), i.e. the head model, has a unique solution. Head models vary from the computationally simpler spherical models (three or four concentric spheres) to realistic models based on the segmentation of anatomical images obtained using magnetic resonance imaging (MRI). Realistic models, though computationally intensive and difficult to implement, can separate different tissues of the head and account for the convoluted geometry of the brain and the significant inter-individual variability. In real-life applications, if the assumptions of the statistical, anatomical or functional properties of the signal and the volume in which it is generated are meaningful, a true three-dimensional tomographic representation of sources of brain electrical activity is possible in spite of the ‘ill-posed’ nature of the inverse problem (Michel et al., 2004).
The techniques used to achieve this are now referred to as electrical source imaging (ESI) or magnetic source imaging (MSI). The first factor influencing reconstruction accuracy is spatial sampling, i.e. the number of EEG electrodes. It has been shown that this relationship is not linear, reaching a plateau at about 128 electrodes, provided the spatial distribution is uniform. The second factor is related to the different properties of the source localization strategies used with respect to the hypothesized source configuration.
Abstract:
Purpose - To investigate if the accuracy of intraocular pressure (IOP) measurements using rebound tonometry over disposable hydrogel (etafilcon A) contact lenses (CL) is affected by the positive power of the CLs. Methods - The experimental group comprised 26 subjects (8 male, 18 female). IOP measurements were undertaken on the subjects’ right eyes in random order using a Rebound Tonometer (ICare). The CLs had powers of +2.00 D and +6.00 D. Measurements were taken over each contact lens and also before and after the CLs had been worn. Results - The IOP measure obtained with both CLs was significantly lower compared to the value without CLs (t test; p < 0.001), but no significant difference was found between the two powers of CLs. Conclusions - Rebound tonometry over positive hydrogel CLs leads to a certain degree of IOP underestimation. This result did not change for the two positive lenses used in the experiment, despite their large difference in power and therefore in lens thickness. Optometrists should bear this in mind when measuring IOP with the rebound tonometer over plus-power contact lenses.
Abstract:
This chapter explores ways in which rigorous mathematical techniques, termed formal methods, can be employed to improve the predictability and dependability of autonomic computing. Model checking, formal specification, and quantitative verification are presented in the contexts of conflict detection in autonomic computing policies and of the implementation of goal and utility-function policies in autonomic IT systems. Each of these techniques is illustrated using a detailed case study and analysed to establish its merits and limitations. The analysis is then used as a basis for discussing the challenges and opportunities of this endeavour to transition the development of autonomic IT systems from the current practice of using ad hoc methods and heuristics towards a more principled approach. © 2012, IGI Global.
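As a toy illustration of what policy conflict detection means here (not the chapter's actual model-checking techniques), the sketch below brute-forces a small finite state space and flags states in which two hypothetical autonomic policies fire contradictory actions. All policy names, state variables, and actions are invented for the example.

```python
# Hypothetical if-then policies: (name, condition over system state, action).
policies = [
    ("scale_up_on_load", lambda s: s["load"] > 80, "add_server"),
    ("cap_cost",         lambda s: s["cost"] > 90, "remove_server"),
]
# Pairs of actions considered mutually contradictory.
conflicting = {("add_server", "remove_server"), ("remove_server", "add_server")}

def find_conflicts(policies, states):
    """Return (state, policy1, policy2) triples where two policies clash."""
    hits = []
    for state in states:
        fired = [(name, act) for name, cond, act in policies if cond(state)]
        for i in range(len(fired)):
            for j in range(i + 1, len(fired)):
                if (fired[i][1], fired[j][1]) in conflicting:
                    hits.append((state, fired[i][0], fired[j][0]))
    return hits

# Enumerate a tiny sampled state space; a model checker would instead
# explore the full space symbolically and exhaustively.
states = [{"load": l, "cost": c} for l in (50, 85) for c in (50, 95)]
conflicts = find_conflicts(policies, states)
print(conflicts)  # one conflict: load=85, cost=95 triggers both policies
```

A model checker improves on this brute-force sketch by proving the absence of conflicts over the entire (possibly infinite) state space rather than a sampled subset.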
Abstract:
Purpose: Several studies have suggested accommodative lags may serve as a stimulus for myopic growth, and while a blurred foveal image is believed to be the main stimulus for accommodation, the spectral composition of the retinal image is also believed to influence accommodative accuracy. Of particular interest is how altering spectral lighting conditions influences accommodation in the presence of soft multifocal contact lenses, which are currently being used off-label for myopia control. Methods: Accommodative responses were assessed using a Grand Seiko WAM-5500 autorefractor for four target distances: 25, 33, 50, and 100 cm for 30 young adult subjects (14 myopic, 16 emmetropic; mean refractive errors (±SD, D) -4.22±2.04 and -0.15±0.67 respectively). Measurements were obtained with four different soft contact lenses, single vision distance (SVD), single vision near (SVN), centre-near (CN) and centre-distance (CD) (+1.50 add), and three different lighting conditions: red (peak λ 632nm), blue (peak λ 460nm), and white (peak λ 560nm). Corrections for chromatic differences in refraction were made prior to calculating accommodative errors. Results: The size of accommodative errors was significantly affected by lens design (p<0.001), lighting (p=0.027), and target distance (p=0.009). Mean accommodative errors were significantly larger with the SV lenses compared to the CD and CN designs (p<0.001). Errors were also significantly larger under blue light compared to white (p=0.004), and a significant interaction was noted between lens design and lighting (p<0.001). Blue light generally decreased accommodative lags and increased accommodative leads relative to white light, while red light had the opposite effect (p≤0.001). Lens design also significantly influenced the direction of accommodative error (i.e. lag or lead) (p<0.001).
Interactions with or between refractive groups were not statistically significant for either the magnitude or the direction of accommodative error (p>0.05 for all). Conclusions: Accuracy of accommodation is affected both by lens design and by the wavelength of lighting. These accommodative lag data lend some support to recent speculation about the potential therapeutic value of lighting with a spectral bias towards blue during near work for myopia, although such treatment effects are likely to be more subtle under broad-spectrum compared to the narrow-spectrum lighting conditions used here.
Abstract:
Cloud computing realizes the long-held dream of converting computing capability into a type of utility. It has the potential to fundamentally change the landscape of the IT industry and our way of life. However, as cloud computing expands substantially in both scale and scope, ensuring its sustainable growth is a critical problem. Service providers have long suffered from high operational costs, especially those associated with the skyrocketing power consumption of large data centers. In the meantime, while efficient power/energy utilization is indispensable for the sustainable growth of cloud computing, service providers must also satisfy a user's quality of service (QoS) requirements. This problem becomes even more challenging considering the increasingly stringent power/energy and QoS constraints, as well as other factors such as the highly dynamic, heterogeneous, and distributed nature of the computing infrastructures. In this dissertation, we study the problem of delay-sensitive cloud service scheduling for the sustainable development of cloud computing. We first focus our research on the development of scheduling methods for delay-sensitive cloud services on a single server with the goal of maximizing a service provider's profit. We then extend our study to scheduling cloud services in distributed environments. In particular, we develop a queue-based model and derive efficient request dispatching and processing decisions in a multi-electricity-market environment to improve profits for service providers. We next study a problem of multi-tier service scheduling. By carefully assigning sub-deadlines to the service tiers, our approach can significantly improve resource usage efficiency with statistically guaranteed QoS. Finally, we study the power-conscious resource provisioning problem for service requests with different QoS requirements.
By properly sharing computing resources among different requests, our method statistically guarantees all QoS requirements with a minimized number of powered-on servers and thus minimized power consumption. The significance of our research is that it is one part of the integrated effort from both industry and academia to ensure the sustainable growth of cloud computing as it continues to evolve and change our society profoundly.
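The single-server, delay-sensitive setting described above can be caricatured with a simple admission rule: accept a request only if it can finish by its deadline, and count the revenue of admitted requests as profit. This is an illustrative sketch of the profit/QoS trade-off, not the dissertation's actual scheduling algorithm; the request tuples and numbers are invented.

```python
# Each request is (arrival_time, service_time, deadline, revenue).
def schedule(requests):
    """Serve requests in arrival order on one server; admit a request
    only if it can complete by its deadline (the QoS constraint)."""
    clock, profit, served = 0.0, 0.0, []
    for arrival, service, deadline, revenue in sorted(requests):
        start = max(clock, arrival)          # server may be busy on arrival
        if start + service <= deadline:      # delay constraint satisfiable
            clock = start + service
            profit += revenue
            served.append((arrival, start))
        # Rejected requests earn nothing but do not block later ones.
    return profit, served

reqs = [(0, 2, 3, 5.0), (1, 2, 4, 4.0), (2, 1, 3, 3.0)]
profit, served = schedule(reqs)
print(profit, len(served))  # 9.0 2 — the third request misses its deadline
```

A profit-maximizing scheduler of the kind the dissertation develops would go further, e.g. reordering or selectively rejecting requests so the admitted set maximizes total revenue rather than following arrival order.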
Abstract:
Acknowledgements We wish to express our gratitude to the National Geographic Society and the National Research Foundation of South Africa for funding the discovery, recovery, and analysis of the H. naledi material. The study reported here was also made possible by grants from the Social Sciences and Humanities Research Council of Canada, the Canada Foundation for Innovation, the British Columbia Knowledge Development Fund, the Canada Research Chairs Program, Simon Fraser University, the DST/NRF Centre of Excellence in Palaeosciences (COE-Pal), as well as by a Discovery Grant from the Natural Sciences and Engineering Research Council of Canada, a Young Scientist Development Grant from the Paleontological Scientific Trust (PAST), a Baldwin Fellowship from the L.S.B. Leakey Foundation, and a Seed Grant and a Cornerstone Faculty Fellowship from the Texas A&M University College of Liberal Arts. We would like to thank the South African Heritage Resource Agency for the permits necessary to work on the Rising Star site; the Jacobs family for granting access; Wilma Lawrence, Bonita De Klerk, Merrill Van der Walt, and Justin Mukanku for their assistance during all phases of the project; Lucas Delezene for valuable discussion on the dental characters of H. naledi. We would also like to thank Peter Schmid for the preparation of the Dinaledi fossil material; Yoel Rak for explaining in detail some of the characters used in previous studies; William Kimbel for drawing our attention to the possibility that there might be a problem with Dembo et al.’s (2015) codes for the two characters related to the articular eminence; Will Stein for helpful discussion about the Bayesian analyses; Mike Lee for his comments on this manuscript; John Hawks for his support in organizing the Rising Star workshop; and the associate editor and three anonymous reviewers for their valuable comments. We are grateful to S. Potze and the Ditsong Museum, B. 
Billings and the School of Anatomical Sciences at the University of the Witwatersrand, and B. Zipfel and the Evolutionary Studies Institute at the University of the Witwatersrand for providing access to the specimens in their care; the University of the Witwatersrand, the Evolutionary Studies Institute, and the South African National Centre of Excellence in PalaeoSciences for hosting a number of the authors while studying the material; and the Western Canada Research Grid for providing access to the high-performance computing facilities for the Bayesian analyses. Last but definitely not least, we thank the head of the Rising Star project, Lee Berger, for his leadership and support, and for encouraging us to pursue the study reported here.
Abstract:
Peer reviewed
Abstract:
Conductive AFM and in situ methods were used to determine the contact resistance and resistivity of individual Sb2S3 nanowires. Nanowires were deposited on an oxidized Si surface for in situ measurements and on a Si surface with macroelectrodes for conductive AFM (C-AFM) measurements. Contact resistance was determined by measuring I(V) characteristics at different distances from the nanowire's contact with the macroelectrode, from which the resistivity of the nanowires was determined. Sb2S3 is a soft material with low adhesion force to the surface, and therefore special precautions were taken during measurements.
Abstract:
A RET network consists of a network of photo-active molecules called chromophores that can participate in inter-molecular energy transfer called resonance energy transfer (RET). RET networks are used in a variety of applications including cryptographic devices, storage systems, light harvesting complexes, biological sensors, and molecular rulers. In this dissertation, we focus on creating a RET device called a closed-diffusive exciton valve (C-DEV), in which the input-to-output transfer function is controlled by an external energy source, similar to a semiconductor transistor like the MOSFET. Due to their biocompatibility, molecular devices like the C-DEV can be used to introduce computing power into biological, organic, and aqueous environments such as living cells. Furthermore, the underlying physics of RET devices is stochastic in nature, making them suitable for stochastic computing, in which true random distribution generation is critical.
In order to determine a valid configuration of chromophores for the C-DEV, we developed a systematic process based on user-guided design space pruning techniques and built-in simulation tools. We show that our C-DEV is 15x better than C-DEVs designed using ad hoc methods that rely on limited data from prior experiments. We also show ways in which the C-DEV can be improved further and how different varieties of C-DEVs can be combined to form more complex logic circuits. Moreover, the systematic design process can be used to search for valid chromophore network configurations for a variety of RET applications.
We also describe a feasibility study for a technique used to control the orientation of chromophores attached to DNA. Being able to control the orientation can expand the design space for RET networks because it provides another parameter to tune their collective behavior. While results showed limited control over orientation, the analysis required the development of a mathematical model that can be used to determine the distribution of dipoles in a given sample of chromophore constructs. The model can be used to evaluate the feasibility of other potential orientation control techniques.