829 results for Two Approaches
Abstract:
This paper describes the process of wrapping existing scientific codes in the domain of plasma physics simulations through the use of Sun's Java Native Interface. We have created a Java front-end for particular functionality offered by legacy native libraries, in order to achieve reusability and interoperability without having to rewrite these libraries. The technique introduced in this paper includes two approaches: one-to-one mapping for wrapping a number of native functions, and peer classes for wrapping native data structures.
Abstract:
Students on introductory courses in programming languages often experience difficulty understanding the basic principles of procedural programming. In this paper we discuss the importance of an early understanding of the subroutine mechanism. Two approaches for self-training – static and dynamic – are presented and compared. The static approach is appropriate for written text in a paper textbook. The dynamic approach is suitable for interactive training using a computer. An interactive module was developed for teaching subroutines.
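The subroutine mechanism the abstract refers to can be illustrated minimally (Python is used here for brevity; the abstract does not name the course's language):

```python
# Minimal illustration of the subroutine mechanism: at each call site,
# control transfers into the subroutine, the parameters are bound to the
# arguments, and the return value flows back to the caller.

def area_of_rectangle(width, height):  # subroutine definition
    return width * height              # return transfers control back

# Two call sites reuse the same subroutine with different arguments.
total = area_of_rectangle(3, 4) + area_of_rectangle(5, 2)
print(total)  # 12 + 10 = 22
```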
Abstract:
Volunteered Service Composition (VSC) refers to the process of composing volunteered services and resources. These services are typically published to a pool of voluntary resources. The composition aims at satisfying some objectives (e.g. utilizing storage and eliminating waste, sharing space and optimizing for energy, reducing computational cost, etc.). In cases where a single volunteered service does not satisfy a request, VSC is required. In this paper, we contribute three approaches for composing volunteered services: an exhaustive, a naïve, and a utility-based search approach to VSC. The proposed new utility-based approach, for instance, is based on measuring the utility that each volunteered service can provide to each request and systematically selecting the one with the highest utility. We found that the utility-based approach tends to be more effective and efficient when selecting services, while minimizing resource waste, when compared to the other two approaches.
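A minimal sketch of the utility-based selection idea: score every volunteered service against a request and pick the one with the highest utility. The service fields and the toy utility formula below are illustrative assumptions, not the paper's actual model.

```python
# Sketch of utility-based service selection under assumed attributes:
# each volunteered service advertises free storage and an energy cost,
# and the (hypothetical) utility rewards a tight fit with low energy cost.

from dataclasses import dataclass

@dataclass
class VolunteeredService:
    name: str
    free_storage_gb: float  # capacity the volunteer offers
    energy_cost: float      # relative energy cost of using it

def utility(service, requested_gb):
    """Toy utility: zero if the request does not fit, otherwise higher
    for less wasted space and lower energy cost."""
    if service.free_storage_gb < requested_gb:
        return 0.0
    waste = service.free_storage_gb - requested_gb
    return 1.0 / (1.0 + waste + service.energy_cost)

def select_service(pool, requested_gb):
    """Systematically pick the service with the highest utility, if any."""
    best = max(pool, key=lambda s: utility(s, requested_gb))
    return best if utility(best, requested_gb) > 0 else None

pool = [
    VolunteeredService("a", free_storage_gb=50, energy_cost=0.2),
    VolunteeredService("b", free_storage_gb=12, energy_cost=0.1),
    VolunteeredService("c", free_storage_gb=8, energy_cost=0.1),
]
print(select_service(pool, 10).name)  # prints "b": fits with least waste
```

An exhaustive approach would instead enumerate combinations of services; the utility-based variant avoids that by ranking candidates per request.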
Abstract:
Kalin Georgiev, Dimitrina Stavrova - We consider two approaches for introducing topological notions in a course on Computational Topology. One is an intuitive, inductive introduction of a notion; the other is a theoretical-analytical one. As an example we treat the notions of interior, closure and boundary of a set in a topological space. We analyze several visual representations as well as analytical ones. Examples of tests and quiz problems are considered. Comparisons of students' achievement on different types of problems are presented.
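For reference, the three notions the abstract treats have standard analytical definitions. For a subset $A$ of a topological space $(X,\tau)$:

```latex
\begin{align*}
\operatorname{int}(A) &= \bigcup \{\, U \in \tau : U \subseteq A \,\}
  && \text{(largest open set contained in } A\text{)} \\
\overline{A} &= \bigcap \{\, F \subseteq X : F \text{ closed},\ A \subseteq F \,\}
  && \text{(smallest closed set containing } A\text{)} \\
\partial A &= \overline{A} \setminus \operatorname{int}(A)
  && \text{(boundary of } A\text{)}
\end{align*}
```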
Abstract:
Over the past 50 years there has been considerable progress in our understanding of biomolecular interactions at an atomic level. This in turn has allowed molecular simulation methods employing full atomistic modeling to develop at ever larger scales. However, some challenging areas still remain where there is either a lack of atomic-resolution structures or where the simulation system is inherently complex. An area where both challenges are present is that of membranes containing membrane proteins. In this review we analyse new practical approaches to membrane protein study that offer a potential new route to high-resolution structures and the possibility of simplifying simulations. These new approaches collectively recognise that preservation of the interaction between the membrane protein and the lipid bilayer is often essential to maintain structure and function. The new methods preserve these interactions by producing nano-scale disc-shaped particles that include the bilayer and the chosen protein. Currently two approaches lead in this area: the MSP system, which relies on peptides to stabilise the discs, and SMALPs, where an amphipathic styrene maleic acid copolymer is used. Both methods greatly enable protein production and hence have the potential to accelerate atomic-resolution structure determination as well as providing a simplified format for simulations of membrane protein dynamics.
Abstract:
This paper places the quality of dyadic relationships in supply chains at the centre of its investigation. Numerous approaches are known in the literature for describing the development of supply chain relationships. These development theories describe the change of dyadic relationships mainly at a theoretical level and do not examine their empirical testability. In our paper we attempt an empirical examination of the development of supply chain relationships, asking whether the life cycle hypothesis can be applied to the development of business relationships over time. Our paper combines two approaches: using data from an internet-based questionnaire and applying quantitative analysis, it tests the hypothesis that business relationship development in time can be described with the concept of a life cycle. The concept of a life cycle is widely used in business research. Among others, the diffusion of innovation is described using this concept, as is the product life cycle, just to name a few. All of these studies analyze the life cycle along a specific variable (for example, the volume of sales or revenue in the case of the product life cycle) which (except for the last stage of the cycle, the decline) has a cumulative character, resulting in the widely known specific shape of a life cycle. Consequently, testing a life cycle hypothesis inevitably means the acceptance of some type of cumulativity in the development.
Abstract:
Computer scientists and social scientists consider the political districting problem from different viewpoints. This paper gives an overview of both strands of the literature on districting, highlighting the connections and the differences between the two approaches.
Abstract:
The concept of types was introduced by Harsányi [8]. In the literature there are two approaches to formalizing types as type spaces: the purely measurable and the topological models. In the former framework, Heifetz and Samet [11] showed that the universal type space exists, and later Meier [13] proved that it is complete. In this paper we examine the topological approach and conclude that there is no universal topological type space in the category of topological type spaces.
Abstract:
The adoption of the new directive known as Solvency II creates a new environment for calculating the solvency capital requirement of insurance companies in the European Union. By modelling the operation of insurance companies, the study analyses how certain characteristics of an insurance portfolio affect the solvency capital requirement in a theoretical model in which the capital requirement values can be calculated according to the Solvency II rules. The model includes insurance and financial risk modules, calculating the solvency capital for the given risk types both separately and jointly in a common model (for comparison with the Solvency II results). Based on the theoretical results, the values calculated for the capital requirement can differ between these two approaches. The results also make it possible to study the factors underlying the differences.
Abstract:
Database design is a difficult problem for non-expert designers. It is desirable to assist such designers during the problem solving process by means of a knowledge-based (KB) system. A number of prototype KB systems have been proposed; however, they have many shortcomings. Few have incorporated sufficient expertise in modeling relationships, particularly higher-order relationships. There has been no empirical study that experimentally tested the effectiveness of any of these KB tools. The problem solving behavior of non-experts, whom the systems were intended to assist, has not been one of the bases for system design. In this project, a consulting system for conceptual database design that addresses the above shortcomings was developed and empirically validated. The system incorporates (a) findings on why non-experts commit errors and (b) heuristics for modeling relationships. Two approaches to knowledge base implementation, system restrictiveness and decisional guidance, were used and compared in this project. The Restrictive approach is proscriptive and limits the designer's choices at various design phases by forcing him/her to follow a specific design path. The Guidance approach, which is less restrictive, provides context-specific, informative and suggestive guidance throughout the design process. The main objectives of the study were to evaluate (1) whether the knowledge-based system is more effective than a system without the knowledge base and (2) which knowledge implementation strategy, Restrictive or Guidance, is more effective. To evaluate the effectiveness of the knowledge base itself, the two systems were compared with a system that does not incorporate the expertise (Control). The experimental procedure involved student subjects solving a task without using the system (pre-treatment task) and another task using one of the three systems (experimental task).
The experimental task scores of those subjects who performed satisfactorily in the pre-treatment task were analyzed. The results are: (1) the knowledge-based approach to database design support led to more accurate solutions than the control system; (2) there was no significant difference between the two KB approaches; (3) the Guidance approach led to the best performance; and (4) the subjects perceived the Restrictive system to be easier to use than the Guidance system.
Abstract:
The increasing amount of available semistructured data demands efficient mechanisms to store, process, and search an enormous corpus of data to encourage its global adoption. Current techniques to store semistructured documents either map them to relational databases or use a combination of flat files and indexes. These two approaches result in a mismatch between the tree structure of semistructured data and the access characteristics of the underlying storage devices. Furthermore, the inefficiency of XML parsing methods has slowed down the large-scale adoption of XML in actual system implementations. The recent development of lazy parsing techniques is a major step towards improving this situation, but lazy parsers still have significant drawbacks that undermine the massive adoption of XML. Once the processing (storage and parsing) issues for semistructured data have been addressed, another key challenge in leveraging semistructured data is to perform effective information discovery on such data. Previous works have addressed this problem in a generic (i.e., domain-independent) way, but this process can be improved if knowledge about the specific domain is taken into consideration. This dissertation had two general goals. The first goal was to devise novel techniques to efficiently store and process semistructured documents. This goal had two specific aims: we proposed a method for storing semistructured documents that maps the physical characteristics of the documents to the geometrical layout of hard drives, and we developed a Double-Lazy Parser for semistructured documents which introduces lazy behavior in both the pre-parsing and progressive parsing phases of the standard Document Object Model's parsing mechanism. The second goal was to construct a user-friendly and efficient engine for performing Information Discovery over domain-specific semistructured documents.
This goal also had two aims: we presented a framework that exploits domain-specific knowledge to improve the quality of the information discovery process by incorporating domain ontologies, and we proposed meaningful evaluation metrics to compare the results of search systems over semistructured documents.
Abstract:
Given the significant amount of attention placed upon race within our society, racial identity has long been nominated as a meaningful influence upon human development (Cross, 1971; Sellers et al., 1998). Scholars investigating aspects of racial identity have largely pursued one of two lines of research: (a) describing factors and processes that contribute to the development of racial identities, or (b) empirically documenting associations between particular racial identities and key adjustment outcomes. However, few studies have integrated these two approaches to simultaneously evaluate developmental and related adjustment aspects of racial identity among minority youth. Consequently, relations between early racial identity developmental processes and correlated adjustment outcomes remain ambiguous. Even less is known regarding the direction and function of these relationships during adolescence. To address this gap, the present study examined key multivariate associations between (a) distinct profiles of racial identity salience and (b) adjustment outcomes within a community sample of African-American youth. Specifically, a person-centered analytic approach (i.e., cluster analysis) was employed to conduct a secondary analysis of two archived databases containing longitudinal data measuring levels of racial identity salience and indices of psychosocial adjustment among youth at four different measurement occasions. Four separate groups of analyses were conducted to investigate (a) the existence of within-group differences in levels of racial identity salience, (b) shifts among distinct racial identity types between contiguous times of measurement, (c) adjustment correlates of racial identity types at each time of measurement, and (d) predictive relations between racial identity clusters and adjustment outcomes, respectively.
Results indicated significant heterogeneity in patterns of racial identity salience among these African-American youth as well as significant discontinuity in the patterns of shifts among identity profiles between contiguous measurement occasions. In addition, within developmental stages, levels of racial identity salience were associated with several adjustment outcomes, suggesting the protective value of high levels of endorsement or internalization of racial identity among the sampled youth. Collectively, these results illustrated the significance of racial identity salience as a meaningful developmental construct in the lives of African-American adolescents, the implications of which are discussed for racial identity and practice-related research literatures.
Abstract:
Database design is a difficult problem for non-expert designers. It is desirable to assist such designers during the problem solving process by means of a knowledge-based (KB) system. Although a number of prototype KB systems have been proposed, there are many shortcomings. Firstly, few have incorporated sufficient expertise in modeling relationships, particularly higher-order relationships. Secondly, there does not seem to be any published empirical study that experimentally tested the effectiveness of any of these KB tools. Thirdly, the problem solving behavior of non-experts, whom the systems were intended to assist, has not been one of the bases for system design. In this project, a consulting system for conceptual database design, called CODA, that addresses the above shortcomings was developed and empirically validated. More specifically, the CODA system incorporates (a) findings on why non-experts commit errors and (b) heuristics for modeling relationships. Two approaches to knowledge base implementation were used and compared in this project, namely system restrictiveness and decisional guidance (Silver 1990). The Restrictive system uses a proscriptive approach and limits the designer's choices at various design phases by forcing him/her to follow a specific design path. The Guidance system approach, which is less restrictive, involves providing context-specific, informative and suggestive guidance throughout the design process. Both approaches are intended to prevent erroneous design decisions. The main objectives of the study were to evaluate (1) whether the knowledge-based system is more effective than the system without a knowledge base and (2) which approach to knowledge implementation, Restrictive or Guidance, is more effective. To evaluate the effectiveness of the knowledge base itself, the systems were compared with a system that does not incorporate the expertise (Control).
An experimental procedure using student subjects was used to test the effectiveness of the systems. The subjects solved a task without using the system (pre-treatment task) and another task using one of the three systems, viz. Control, Guidance or Restrictive (experimental task). Analysis of the experimental task scores of those subjects who performed satisfactorily in the pre-treatment task revealed that the knowledge-based approach to database design support led to more accurate solutions than the control system. Of the two KB approaches, the Guidance approach was found to lead to better performance when compared to the Control system. The subjects were found to perceive the Restrictive system as easier to use than the Guidance system.
Abstract:
Advanced age may become a limiting factor for the maintenance of rhythms in organisms, reducing the capacity for generation and synchronization of biological rhythms. In this study, the influence of aging on the expression of endogenous periodicity and synchronization (photic and social) of the circadian activity rhythm (CAR) was evaluated in a diurnal primate, the marmoset (Callithrix jacchus). This study had two approaches: one with a longitudinal design, performed with a male marmoset in two different phases, adult (3 y.o.) and aged (9 y.o.) (study 1); and a second with a transversal design, with 6 aged (♂: 9.7 ± 2.0 y.o.) and 11 adult animals (♂: 4.2 ± 0.8 y.o.) (study 2). The evaluation of photic synchronization involved two LD conditions (natural and artificial illumination). In study 1, the animal was subjected to the following stages: LD (12:12, ~350:~2 lx), LL (~350 lx) and LD resynchronization. In study 2, the animals were initially evaluated in natural LD, followed by the same sequence of stages as in study 1. During the LL stage in study 2, the vocalizations of conspecifics kept in natural LD outside the colony were considered a temporal cue for social synchronization. Activity was recorded automatically at five-minute intervals through an infrared sensor and actimeters in studies 1 and 2, respectively. In general, the aged animals showed a more fragmented activity pattern (higher IV, lower H and higher PSD; ANOVA, p < 0.05), lower levels of activity (ANOVA, p < 0.05) and a shorter duration of the active phase (ANOVA, p < 0.05) under LD conditions when compared to adults. In natural LD, the aged animals presented a pronounced phase delay in the onset and offset of the active phase (ANOVA, p < 0.05), while the adults had the active phase better adjusted to the light phase. Under artificial LD, there was a phase advance and greater adjustment of the onset and offset of activity relative to natural LD in the aged animals (ANOVA, p < 0.05).
In LL, there was a positive correlation between age and the endogenous period (τ) in the first 20 days (Spearman correlation, p < 0.05), with a prolonged τ maintained in two aged animals. In this condition, most adults showed a free-running circadian activity rhythm with τ < 24 h for the first 30 days and, later, relative coordination mediated by auditory cues. In study 2, cross-correlation analysis between the activity profiles of the animals in LL and control animals kept under natural LD showed less social synchronization in the aged animals. Upon resubmission to LD, the resynchronization rate was slower in the aged animals (t-test; p < 0.05), and in just one aged animal there was a loss of resynchronization capability. Taken together, the data suggest that aging in marmosets may be related to: 1) lower amplitude and greater fragmentation of activity, accompanied by a phase delay and lengthening of the period, caused by changes in photic input and in the generation and behavioral expression of the CAR; 2) lower capacity of the circadian activity rhythm for photic synchronization, which can become more robust under artificial lighting conditions, possibly due to the higher light intensities at the beginning of the active phase resulting from the abrupt transitions between the light and dark phases; and 3) lower capacity for non-photic synchronization by auditory cues from conspecifics, possibly due to reduced sensory input and responsiveness of the circadian oscillators to auditory cues, which can make the aged marmoset more vulnerable, as these social cues may act as an important supporting factor for photic synchronization.