884 results for Theory of conceptual fields
Abstract:
Researchers suggest that personalization on the Semantic Web will eventually add up to a Web 3.0. In this Web, personalized agents, rather than humans, process and thus generate the largest share of information. This is well conceivable in the sense of emergent semantics, which supplements the traditional formal semantics of the Semantic Web. An emergent Semantic Web underpinned by fuzzy grassroots ontologies can be accomplished by inducing knowledge from users' common parlance in mutual Web 2.0 interactions [1]. These ontologies can also be matched against existing Semantic Web ontologies to create comprehensive top-level ontologies. If augmented with information in the form of restrictions and associated reliability (Z-numbers) [2], this collection of fuzzy ontologies constitutes an important basis for implementing Zadeh's restriction-centered theory of reasoning and computation (RRC) [3]. By accounting for the fuzziness of the real world, RRC differs from traditional approaches in that it can handle restrictions described in natural language. A restriction is an answer to a question about the value of a variable, such as the duration of an appointment. In addition to mathematically well-defined answers, RRC can likewise deal with unprecisiated answers such as "about one hour." Inspired by mental functions, RRC constitutes an important basis for leveraging present-day Web efforts into a natural Web 3.0. Based on natural language information, RRC may be realized with Z-number calculation to achieve personalized Web reasoning and computation. Finally, Web agents that understand natural language can react to humans more intuitively and thus generate and process information.
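To make the notion of a restriction with associated reliability concrete, the following minimal Python sketch models a Z-number Z = (A, B), where A is a fuzzy restriction on the value of a variable and B expresses the reliability of A; the membership function, variable name, and values are illustrative assumptions, not part of the cited work.

```python
# Minimal sketch of a Z-number as used in Zadeh's restriction-centered
# theory (RRC): Z = (A, B), where A is a fuzzy restriction on the value
# of a variable and B expresses the reliability of A. All names here
# are illustrative, not taken from any specific library or paper.
from dataclasses import dataclass
from typing import Callable

def triangular(a: float, b: float, c: float) -> Callable[[float], float]:
    """Triangular membership function peaking at b, zero outside [a, c]."""
    def mu(x: float) -> float:
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x <= b else (c - x) / (c - b)
    return mu

@dataclass
class ZNumber:
    variable: str                           # e.g. "duration of an appointment"
    restriction: Callable[[float], float]   # fuzzy set A over the variable's values
    reliability: str                        # fuzzy probability B, e.g. "quite sure"

# "The appointment lasts about one hour, and I am quite sure of that."
about_one_hour = ZNumber(
    variable="duration (minutes)",
    restriction=triangular(45, 60, 75),
    reliability="quite sure",
)

# Degree to which 55 minutes is compatible with "about one hour":
print(about_one_hour.restriction(55))  # 0.666...
```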
Abstract:
Objective. The purpose of the study is to provide a holistic depiction of the behavioral and environmental factors contributing to risky sexual behaviors among predominantly high-school-educated, low-income African Americans residing in urban areas of Houston, TX, using the Theory of Gender and Power, Situational/Environmental Variables Theory, and Sexual Script Theory. Methods. A cross-sectional study was conducted via questionnaires among 215 Houston-area residents, of whom 149 were women and 66 were men. Measures used to assess behaviors of the population included a history of homelessness, use of crack/cocaine and several other illicit drugs, the type of sexual partner, age of the participant, age of the most recent sex partner, whether participants had sought health care in the last 12 months, knowledge of the partner's other sexual activities, symptoms of depression, and the places where partners were met. To determine the risk of sexual encounters, a risk index employing the variables used to assess condom use was created, categorizing sexual encounters as unsafe or safe. Results. Variables meeting the significance level of p < .15 in the bivariate analysis for each theory were entered into a binary logistic regression analysis. The block for each theory was significant, suggesting that the grouping of variables by theory was significantly associated with unsafe sexual behaviors. Within the regression analysis, variables such as sex for drugs/money, low income, and crack use demonstrated effect sizes of magnitude ≥ 1, indicating that these variables had a significant effect on unsafe sexual behavioral practices. Conclusions. Variables assessing behavior and environment demonstrated a significant effect when categorized by relation to the designated theories.
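As a hedged illustration of the analysis pipeline described above (bivariate screening at p < .15 followed by a block-entered binary logistic regression), here is a minimal Python sketch; the file name, column names, and candidate variables are hypothetical placeholders, not the study's actual data.

```python
# Hedged sketch of the screening-then-regression pipeline described above,
# with entirely hypothetical column names; the study's actual variables
# and data are not reproduced here.
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("survey.csv")  # hypothetical data file
outcome = "unsafe_sex"          # 1 = unsafe encounter, 0 = safe (risk index)
candidates = ["sex_for_drugs_money", "low_income", "crack_use", "homeless_ever"]

# Step 1: bivariate screening -- keep predictors with p < .15.
kept = []
for var in candidates:
    model = sm.Logit(df[outcome], sm.add_constant(df[var])).fit(disp=0)
    if model.pvalues[var] < 0.15:
        kept.append(var)

# Step 2: enter the surviving block into one binary logistic regression.
block = sm.Logit(df[outcome], sm.add_constant(df[kept])).fit(disp=0)
print(block.summary())  # coefficients with |b| >= 1 read as large effects
```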
Abstract:
In order to fully describe the construct of empowerment and to determine possible measures for this construct in racially and ethnically diverse neighborhoods, a qualitative study based on Grounded Theory was conducted at both the individual and collective levels. Participants included 49 grassroots experts on community empowerment, who were interviewed through semi-structured interviews and focus groups. The researcher also conducted field observations as part of the research protocol. The results of the study identified benchmarks of individual and collective empowerment and hundreds of possible markers of collective empowerment applicable in diverse communities. Results also indicated that community involvement is essential in the selection and implementation of proper measures. Additional findings were that the construct of empowerment involves specific principles of empowering relationships and particular motivational factors. All of these findings led to a two-dimensional model of empowerment based on the concepts of relationships among members of a collective body and the collective body's desire for socio-political change. These results suggest that the design, implementation, and evaluation of programs that foster empowerment must be based on collaborative ventures between the population being served and program staff because of the interactive, synergistic nature of the construct. In addition, empowering programs should embrace specific principles and processes of individual and collective empowerment in order to maximize their effectiveness and efficiency. Finally, the results suggest that collaboratively choosing markers to measure the processes and outcomes of empowerment in the main systems and populations living in today's multifaceted communities is a useful mechanism to determine change.
Abstract:
Purpose: In homeopathy and anthroposophically extended medicine, high dilutions are used. Such dilutions have shown significant differences in ultraviolet (UV) light transmission between controls and different dilution levels. Exposing such dilutions to physical factors such as UV light or elevated temperature (37 °C) yielded significantly different UV transmission values compared to unexposed dilutions. The aim was to test whether the electromagnetic fields (EMF) of a mobile phone affect the UV absorbance of dilutions of Atropa belladonna (Ab) and quartz. Methods: Commercially available dilutions of Ab 4x, 6x, 12x, 15x, 30x and of quartz 6x, 12x, 15x, 30x were investigated. On 5 days, 4 samples of each dilution were exposed to the EMF of a mobile phone at 900 MHz (GSM) with an output power of 2 W for 3 h. Control samples were kept in a separate room. UV absorbance of the samples in the range from 190 to 340 nm was measured in randomized order. The average absorbance from 200 to 340 nm and from 200 to 240 nm was compared between exposed and unexposed samples by a dependent t-test. Results: No significant differences were detected between unexposed and exposed dilutions of Ab and quartz, except for quartz 12x over the range from 200 to 340 nm. Conclusion: Exposure of high dilutions of Ab and quartz to the GSM EMF of a mobile phone did not alter the UV absorbance of these dilutions.
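The comparison described in the Methods reduces to averaging absorbance over a wavelength window and applying a dependent (paired) t-test; the sketch below shows one way this could look in Python, with placeholder random spectra standing in for the measured data.

```python
# Minimal sketch of the comparison described above: average UV absorbance
# over a wavelength window, compared between paired exposed and control
# samples with a dependent (paired) t-test. The arrays are placeholders.
import numpy as np
from scipy.stats import ttest_rel

wavelengths = np.arange(190, 341)                # nm, as in the measured range
exposed = np.random.rand(4, wavelengths.size)    # 4 samples x spectrum (placeholder)
control = np.random.rand(4, wavelengths.size)

def mean_absorbance(spectra: np.ndarray, lo: int, hi: int) -> np.ndarray:
    """Average absorbance of each sample between lo and hi nm (inclusive)."""
    mask = (wavelengths >= lo) & (wavelengths <= hi)
    return spectra[:, mask].mean(axis=1)

# Dependent t-test over the 200-340 nm window (repeat for 200-240 nm).
t, p = ttest_rel(mean_absorbance(exposed, 200, 340),
                 mean_absorbance(control, 200, 340))
print(f"t = {t:.3f}, p = {p:.3f}")
```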
Abstract:
Recent experiments revealed that the fruit fly Drosophila melanogaster has a dedicated mechanism for forgetting: blocking the G-protein Rac leads to slower forgetting, and activating Rac to faster forgetting. This active form of forgetting lacks a satisfactory functional explanation. We investigated optimal decision making for an agent adapting to a stochastic environment in which a stimulus may switch between being indicative of reward or punishment. Like Drosophila, an optimal agent shows forgetting, with a rate that is linked to the time scale of changes in the environment. Moreover, to reduce the odds of missing future reward, an optimal agent may trade the risk of immediate pain for information gain and thus forget faster after aversive conditioning. A simple neuronal network reproduces these features. Our theory shows that forgetting in Drosophila appears as an optimal adaptive behavior in a changing environment. This is in line with the view that forgetting is adaptive rather than a consequence of limitations of the memory system.
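A minimal toy sketch can make the link between environmental time scale and forgetting rate concrete: a Bayesian agent tracking a contingency that flips with hazard rate h necessarily relaxes its belief toward the uninformative prior at a rate set by h. All numbers below are illustrative assumptions, not parameters from the paper.

```python
# Toy illustration: an agent tracks P(stimulus predicts reward) in an
# environment whose contingency flips with hazard rate H per trial.
# The hazard term in the update decays the belief toward the 0.5 prior,
# i.e. forgetting speed is tied to how quickly the environment changes.
H = 0.05  # hazard rate: per-trial probability that the contingency flips

def bayes_step(belief: float, outcome_reward: bool, acc: float = 0.8) -> float:
    """Bayesian update after one observed outcome, then hazard relaxation."""
    like_r = acc if outcome_reward else 1 - acc   # P(outcome | predicts reward)
    like_p = 1 - acc if outcome_reward else acc   # P(outcome | predicts punishment)
    posterior = like_r * belief / (like_r * belief + like_p * (1 - belief))
    return (1 - H) * posterior + H * (1 - posterior)  # contingency may have flipped

belief = 0.5
for _ in range(20):                # aversive conditioning: punishments observed
    belief = bayes_step(belief, outcome_reward=False)

for t in range(10):                # no informative outcomes: pure "forgetting"
    belief = (1 - H) * belief + H * (1 - belief)  # decays toward 0.5 at rate ~2H
    print(f"trial {t}: P(stimulus predicts reward) = {belief:.3f}")
```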
Abstract:
We define an applicative theory of truth TPT which proves totality exactly for the polynomial-time computable functions. TPT has natural and simple axioms, since nearly all of its truth axioms are standard for truth theories over an applicative framework. The only exception is the axiom dealing with the word predicate: the truth predicate can reflect elementhood in the words only for terms whose length is smaller than a given word. This is what makes the very low proof-theoretic strength achievable. Truth induction can be allowed without any constraints, so the system TPT has the high expressive power one expects from truth theories; it allows embeddings of feasible systems of explicit mathematics and bounded arithmetic. The proof that the theory TPT is feasible is not easy, since a standard realisation approach cannot be applied. For this reason we develop a new realisation approach whose realisation functions work on directed acyclic graphs, which lets us express and manipulate realisation information more efficiently.
Development of meta-representations: Procedural metacognition and the relationship to Theory of Mind
Abstract:
Several studies have shown that metacognitive ability is crucial for children's success in school. Much less is known about the emergence of this ability and its relationship to other meta-representations, such as Theory of Mind competencies. In recent years, a growing literature has suggested that metacognition and Theory of Mind can theoretically be assumed to belong to the same developmental concept. So far, however, only a few studies have provided empirical evidence that metacognition and Theory of Mind are related, and these studies focused on declarative metacognitive knowledge rather than on procedural metacognitive monitoring, as the present study does: N = 159 children were tested first shortly before making the transition to school (aged between 5½ and 7½ years) and again one year later, at the end of their first grade. Analyses suggest that there is in fact a significant relation between early metacognitive monitoring skills (procedural metacognition) and later Theory of Mind competencies. Notably, language seems to play a crucial role in this relationship. Our results thus bring new insights to research on the development of meta-representations and support the view that metacognition and Theory of Mind are indeed interrelated, although the precise mechanisms remain unclear.
Abstract:
OBJECTIVE To obtain new details of the radial motion of left ventricular (LV) segments using velocity-encoded cardiac MRI. METHODS Cardiac MR examinations were performed on 14 healthy volunteers aged between 19 and 26 years. Cine images for navigator-gated phase-contrast velocity mapping were acquired using a black-blood segmented k-space spoiled gradient echo sequence with a temporal resolution of 13.8 ms. Peak systolic and diastolic radial velocities as well as radial velocity curves were obtained for 16 ventricular segments. RESULTS Significant differences among the peak radial velocities of basal and mid-ventricular segments were recorded. Particular patterns of segmental radial velocity curves were also noted. Of particular interest was an additional wave of outward radial movement during the phase of rapid ventricular filling, corresponding to the expected timing of the third heart sound. CONCLUSION The technique allowed visualization of new details of LV radial wall motion. In particular, the higher peak systolic radial velocities of anterior and inferior segments suggest relatively higher dynamics of anteroposterior vs lateral radial motion in systole. Specific patterns of radial motion of other LV segments may provide additional insights into LV mechanics. ADVANCES IN KNOWLEDGE The outward radial movement of LV segments impacted by blood flow during rapid ventricular filling provides a potential substrate for the third heart sound. A biphasic radial expansion of the basal anteroseptal segment in early diastole is likely related to the simultaneous longitudinal LV displacement by the stretched great vessels following repolarization and their close apposition to this segment.
Abstract:
Quarks were introduced 50 years ago, opening the road towards our understanding of the elementary constituents of matter and their fundamental interactions. Since then, spectacular progress has been made, with important discoveries that led to the establishment of the Standard Theory, which accurately describes the basic constituents of observable matter, namely quarks and leptons, interacting through the exchange of three fundamental forces: the weak, electromagnetic, and strong forces. Particle physics is now entering a new era driven by the quest to understand the composition of our Universe, including unobservable (dark) matter, the hierarchy of masses and forces, the unification of all fundamental interactions with gravity in a consistent quantum framework, and several other important questions. A candidate theory providing answers to many of these questions is string theory, which replaces the notion of point particles by extended objects such as closed and open strings. In this short note, I give a brief overview of string unification, describe in particular how quarks and leptons can emerge, and discuss possible predictions for particle physics and cosmology that could test these ideas.