904 results for Multi-dimensional database
Abstract:
Much research pursues machine intelligence through better representation of semantics. What is semantics? People in different areas view semantics from different facets, although it has accompanied interaction throughout civilization. Some researchers believe that humans have some innate structure in mind for processing semantics. If so, what is that structure like? Others argue that humans evolve a structure for processing semantics through constant learning. If so, how does that process unfold? Humans have invented various symbol systems to represent semantics. Can semantics be accurately represented? Turing machines are good at processing symbols according to algorithms designed by humans, but they are limited in their ability to process semantics and to interact actively. Supercomputers and high-speed networks do not help solve this issue, as they have no semantic worldview and cannot reflect on themselves. Can a future cyber-society have semantic images that enable machines and individuals (humans and agents) to reflect on themselves and interact with each other while being aware of the social situation through time? This paper addresses these issues in the context of studying an interactive semantics for the future cyber-society. It first distinguishes social semantics from natural semantics, and then explores interactive semantics within the category of social semantics. Interactive semantics consists of an interactive system and its semantic image, which co-evolve and influence each other. The semantic worldview and the interactive semantic base are proposed as the semantic basis of interaction. The process of building and explaining a semantic image can be based on an evolving structure incorporating an adaptive multi-dimensional classification space and a self-organized semantic link network. A semantic lens is proposed to enhance the potential of this structure and to help individuals build and retrieve semantic images from different facets, abstraction levels and scales through time.
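As a purely illustrative aside (an assumption on our part, not the paper's formal model), the self-organized semantic link network mentioned above can be thought of as a graph whose edges carry semantic types; a minimal sketch with hypothetical class and method names:

```python
import collections

# Toy sketch of a semantic link network: nodes joined by typed semantic links.
# The paper's semantic images and semantic lens are far richer than this.
class SemanticLinkNetwork:
    def __init__(self):
        # node -> list of (link_type, target_node)
        self.links = collections.defaultdict(list)

    def add_link(self, source, link_type, target):
        self.links[source].append((link_type, target))

    def neighbours(self, node, link_type=None):
        # Retrieve linked nodes, optionally filtered by semantic link type.
        return [t for lt, t in self.links[node] if link_type is None or lt == link_type]

net = SemanticLinkNetwork()
net.add_link("agent-A", "collaboratesWith", "agent-B")
net.add_link("agent-A", "authorOf", "document-1")
print(net.neighbours("agent-A", "collaboratesWith"))   # ['agent-B']
```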
Abstract:
The main purpose of this dissertation is to assess the relation between municipal benchmarking and organisational learning, with a specific emphasis on benchlearning and performance within municipalities, and between groups of municipalities, in the building and housing sector in the Netherlands. The first and main conclusion is that this relation exists, but that the relative success of different approaches to dimensions of change and organisational learning is a key explanatory factor for differences in the success of benchlearning. Seven other important conclusions could be derived from the empirical research. First, a combination of interpretative approaches at the group level with a mixture of hierarchical and network strategies positively influences benchlearning. Second, interaction among professionals at the inter-organisational level strengthens benchlearning. Third, stimulating supporting factors can be seen as a more important strategy for strengthening benchlearning than pulling down barriers. Fourth, in order to facilitate benchlearning, intrinsic motivation and communication skills matter, and they are supported by a high level of cooperation (i.e., team work), a flat organisational structure and interactions between individuals. Fifth, benchlearning is facilitated by a strategy that is based on a balanced use of episodic (emergent) and systemic (deliberate) forms of power. Sixth, high levels of benchlearning are facilitated by an analyser or prospector strategic stance. Prospectors and analysers reach a different learning outcome than defenders and reactors: whereas analysers and prospectors are willing to change policies when this is perceived as necessary, the strategic stances of defenders and reactors result in narrow process improvements (i.e., single-loop learning). Seventh, performance improvement is influenced by functional perceptions towards performance, and these perceptions ultimately influence the elements adopted. This research shows that efforts aimed at benchlearning, and ultimately at improved service delivery, should be directed towards a multi-level and multi-dimensional approach addressing the context, content and process of dimensions of change and organisational learning.
Abstract:
Material processing using high-intensity femtosecond (fs) laser pulses is a fast-developing technology holding potential for direct writing of multi-dimensional optical structures in transparent media. In this work we re-examine nonlinear diffraction theory in the context of fs laser processing of silica in the sub-critical regime (input power less than the critical power of self-focusing). We have applied the well-known theory developed by Vlasov, Petrishev and Talanov, which gives an analytical description of the evolution of the root-mean-square width R_RMS(z) of a (not necessarily Gaussian) beam in a medium with Kerr nonlinearity.
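For context, the moment (virial) result referred to above can be sketched as follows for an initially collimated beam; the exact prefactor and the definition of the diffraction length z_d depend on the normalisation adopted in the paper, so this is only an orientation, not the authors' exact expression:

\[
\frac{d^{2}R_{\mathrm{RMS}}^{2}(z)}{dz^{2}} = \mathrm{const}, \qquad
R_{\mathrm{RMS}}^{2}(z) \simeq R_{0}^{2}\left[\,1 + \left(1 - \frac{P}{P_{\mathrm{cr}}}\right)\frac{z^{2}}{z_{d}^{2}}\right],
\]

so that in the sub-critical regime (P < P_cr) the bracket stays positive and the beam broadens monotonically, while for P > P_cr it vanishes at a finite z, signalling self-focusing collapse.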
Abstract:
The publication represents a multi-dimensional and multi-faceted, in-depth assessment of the most significant determinants of the EU's development as a political, economic and legal entity, in the format emerging from the Lisbon Treaty. The book is an important contribution to our understanding of the most profound issues in the recent process of EU integration, including the issue of maintaining its cohesion and coherence under the stress of the global challenges also faced by the European Union. The authors formulate worthwhile conclusions of high value not only for academics but also for political decision-makers, which gives the book a competitive edge over more theoretical and, hence, less practice-oriented works. The argumentation presented in the book will not be left without a reaction from academic and/or professional circles. I take it almost for granted that the overall setting of the argumentation, as well as the specific points made in its various chapters, will find adequate resonance in the high-profile discussion likely to emerge once the book has been published.
Abstract:
This paper presents a novel intonation modelling approach and demonstrates its applicability using the Standard Yorùbá language. Our approach is motivated by the theory that abstract and realised forms of intonation and other dimensions of prosody should be modelled within a modular and unified framework. In our model, this framework is implemented using the Relational Tree (R-Tree) technique. The R-Tree is a sophisticated data structure for representing a multi-dimensional waveform in the form of a tree. Our R-Tree for an utterance is generated in two steps. First, the abstract structure of the waveform, called the Skeletal Tree (S-Tree), is generated using tone phonological rules for the target language. Second, the numerical values of the perceptually significant peaks and valleys on the S-Tree are computed using a fuzzy-logic-based model. The resulting points are then joined by applying interpolation techniques. The actual intonation contour is synthesised with the Pitch Synchronous Overlap and Add (PSOLA) technique using the Praat software. We performed both quantitative and qualitative evaluations of our model. The preliminary results suggest that, although the model does not predict the numerical speech data as accurately as contemporary data-driven approaches, it produces synthetic speech with comparable intelligibility and naturalness. Furthermore, our model is easy to implement, interpret and adapt to other tone languages.
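A hedged sketch of the interpolation step only (the R-Tree, S-Tree and fuzzy-logic components are not reproduced here; the function name and target values are hypothetical):

```python
import numpy as np

def f0_contour_from_targets(times_s, f0_targets_hz, frame_rate_hz=100):
    """Join perceptually significant peak/valley targets into a continuous
    F0 contour by linear interpolation, one value per analysis frame."""
    frame_times = np.arange(times_s[0], times_s[-1], 1.0 / frame_rate_hz)
    return frame_times, np.interp(frame_times, times_s, f0_targets_hz)

# Hypothetical targets (time in seconds, F0 in Hz) for a short utterance
t, f0 = f0_contour_from_targets([0.0, 0.4, 0.9, 1.3], [120.0, 180.0, 110.0, 90.0])
```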
Abstract:
Solving many scientific problems requires effective regression and/or classification models for large high-dimensional datasets. Experts from these problem domains (e.g. biologists, chemists, financial analysts) have insights that can be helpful in developing powerful models, but they need a modelling framework that helps them use these insights. Data visualisation is an effective technique for presenting data and eliciting feedback from the experts. A single global regression model can rarely capture the full behavioural variability of a huge multi-dimensional dataset. Instead, local regression models, each focused on a separate area of the input space, often work better, since the behaviour of different areas may vary. Classical local models such as Mixture of Experts segment the input space automatically, which is not always effective and also lacks involvement of the domain experts to guide a meaningful segmentation of the input space. In this paper we address this issue by allowing domain experts to interactively segment the input space using data visualisation. The segmentation obtained is then used to develop effective local regression models.
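A minimal sketch of the local-modelling idea, assuming the expert's interactive segmentation is already available as a label per data point (the function names and the choice of linear models are ours, not the paper's):

```python
import numpy as np
from sklearn.linear_model import LinearRegression

def fit_local_models(X, y, segment_ids):
    # One regression model per expert-defined segment of the input space.
    models = {}
    for seg in np.unique(segment_ids):
        mask = segment_ids == seg
        models[seg] = LinearRegression().fit(X[mask], y[mask])
    return models

def predict_local(models, X, segment_ids):
    # Route each point to the model of the segment it belongs to.
    y_hat = np.empty(len(X))
    for seg, model in models.items():
        mask = segment_ids == seg
        y_hat[mask] = model.predict(X[mask])
    return y_hat
```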
Abstract:
Algorithmic resources for the elaboration and identification of monotone functions are considered, and some alternative structures are introduced that are more explicit in terms of structure and quantities and that can serve as elements of practical identification algorithms. General monotone recognition is considered on a multi-dimensional grid structure. A particular reconstruction problem is reduced to monotone recognition through partitioning the multi-dimensional grid into a set of binary cubes.
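For orientation only, a brute-force check of monotonicity on a multi-dimensional grid (the paper's recognition and reconstruction algorithms are considerably more refined than this sketch):

```python
import itertools

def is_monotone_on_grid(f, shape):
    """Return True if f is monotone on the grid {0..n1-1} x ... x {0..nk-1}:
    increasing any single coordinate never decreases f."""
    for point in itertools.product(*(range(n) for n in shape)):
        for axis, n in enumerate(shape):
            if point[axis] + 1 < n:
                successor = list(point)
                successor[axis] += 1
                if f(tuple(successor)) < f(point):
                    return False
    return True

# Example: a threshold function on a 3 x 3 x 3 grid is monotone
print(is_monotone_on_grid(lambda p: int(sum(p) >= 4), (3, 3, 3)))   # True
```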
Abstract:
Mathematics Subject Classification: 65C05, 60G50, 39A10, 92C37
Abstract:
Abstract (provisional): Background: Failing a high-stakes assessment at medical school is a major event for those who go through the experience. Students who fail at medical school may be more likely to struggle in professional practice, so helping individuals overcome problems and respond appropriately is important. Little is understood about what factors influence how individuals experience failure or make sense of the failing experience in remediation. The aim of this study was to investigate the complexity surrounding the failure experience from the student's perspective using interpretative phenomenological analysis (IPA). Methods: The accounts of three medical students who had failed final re-sit exams were subjected to in-depth analysis using IPA methodology. IPA was used to analyse each transcript case by case, allowing the researcher to make sense of the participant's subjective world. The analysis process allowed the complexity surrounding the failure to be highlighted, alongside a narrative describing how students made sense of the experience. Results: The circumstances surrounding students as they approached assessment and experienced failure at finals were a complex interaction between academic problems, personal problems (specifically finance and relationships), strained relationships with friends, family or faculty, and various mental health problems. Each student faced multi-dimensional issues, each with their own individual combination of problems, but experienced remediation as a one-dimensional intervention focused only on improving performance in written exams. What these students needed included help with clinical skills, plus social and emotional support. Fear of termination of their course was a barrier to open communication with staff. Conclusions: These students' experience of failure was complex. The experience of remediation is influenced by the way in which students make sense of failing. Generic remediation programmes may fail to meet the needs of students for whom personal, social and mental health issues are part of the picture.
Abstract:
Fibre lasers are light sources that are synonymous with stability. They can give rise to highly coherent continuous-wave radiation, or to a stable train of mode-locked pulses with well-defined characteristics. However, they can also exhibit an exceedingly diverse range of nonlinear operational regimes spanning a multi-dimensional parameter space. The complex nature of the dynamics poses significant challenges for theoretical and experimental studies of such systems. Here, we demonstrate how the real-time experimental methodology of spatio-temporal dynamics can be used to unambiguously identify and discern between such highly complex lasing regimes. This two-dimensional representation of laser intensity allows individual features embedded in the radiation to be identified and tracked as they make round-trip circulations inside the cavity. The salient features of this methodology are highlighted by its application to Raman fibre lasers and to a partially mode-locked ring fibre laser operating in the normal dispersion regime.
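The core of the spatio-temporal representation is a simple reshaping of a long intensity record into round trips; a hedged sketch (array sizes and variable names are hypothetical):

```python
import numpy as np

def spatio_temporal_map(intensity_trace, samples_per_round_trip):
    # Stack consecutive cavity round trips as rows so that features in the
    # radiation can be tracked down the columns from one round trip to the next.
    n_round_trips = len(intensity_trace) // samples_per_round_trip
    return intensity_trace[: n_round_trips * samples_per_round_trip].reshape(
        n_round_trips, samples_per_round_trip)

# Synthetic example: 500 round trips of 2048 samples each
trace = np.random.rand(500 * 2048)
st_map = spatio_temporal_map(trace, 2048)   # shape (500, 2048)
```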
Abstract:
The present study examines the extent to which blacks are segregated in the suburban community of Coconut Grove, Florida. Hypersegregation, or the general tendency for blacks and whites to live apart, was examined in terms of four distinct dimensions: evenness, exposure, clustering, and concentration. Together, these dimensions define the geographic traits of the target area. Alone, these indices cannot capture the multi-dimensional levels of segregation and therefore, by themselves, underestimate the severity of segregation and isolation in this community. This study takes a contemporary view of segregation in a Dade County community to see whether segregation is the catalyst for the sometimes-cited violent response of blacks. The study yields results that support the information in the literature review and thesis research question sections, namely that blacks within the Grove do respond violently to the negative effects that racial segregation causes. This thesis is unique in two ways: it examines segregation in a suburban environment rather than an urban inner city, and it presents a responsive analysis of the individuals studied, rather than relying only on demographic and statistical data.
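As background, one standard measure of the evenness dimension (not necessarily the exact index computed in the thesis) is the index of dissimilarity:

\[
D = \frac{1}{2}\sum_{i=1}^{n}\left|\frac{b_i}{B} - \frac{w_i}{W}\right|,
\]

where b_i and w_i are the black and white populations of areal unit i, B and W are the corresponding totals, and D ranges from 0 (complete evenness) to 1 (complete segregation).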
Abstract:
This study evaluated inter- and intra-individual changes in acculturation, acculturative stress, and adaptation experiences, as well as their associations with adjustment outcomes, among a group of Latino adolescents in South Florida. Specifically, the current study investigated the incidence, changes, and effects of stressors that arise from acculturation experiences (e.g., related to culture, discrimination, language difficulties) among Latino youth by employing a person-centered approach and a longitudinal research design. Four separate groups of analyses were conducted to investigate (a) within-group differences in levels of reported acculturative stress, (b) patterns of continuity and discontinuity in levels of acculturative stress across time, (c) adjustment outcomes associated with distinct patterns of acculturative stress within each measurement occasion, and (d) predictive relations between longitudinal acculturative stress trajectories in early adolescence and psychosocial adjustment outcomes in young adulthood. Results from the multivariate analyses indicated great within-group heterogeneity in acculturative stress among Latino youth during early adolescence, as well as significant continuity and discontinuity in the patterns of shifts among acculturative stress profiles between contiguous measurement occasions. Within each developmental period, membership in acculturative stress clusters was significantly and differentially associated with multiple adjustment outcomes, suggesting that maladaptive outcomes are more likely to occur among Latino adolescents experiencing high levels of psychological distress across multiple acculturative domains. In general, Latino youth acculturation is best understood as multi-dimensional, variable across time, and fluid and responsive to multiple factors and influences. Implications for preventive strategies are discussed with regard to the acculturation and developmental psychology research literatures.
Abstract:
The development of 3G (third-generation telecommunication) value-added services brings higher requirements for Quality of Service (QoS). Wideband Code Division Multiple Access (WCDMA) is one of the three 3G standards, and enhancement of QoS for the WCDMA Core Network (CN) is becoming more and more important for users and carriers. This dissertation focuses on enhancement of QoS for the WCDMA CN, with the purpose of realizing the DiffServ (Differentiated Services) model of QoS for the WCDMA CN. Based on the parallelism characteristic of Network Processors (NPs), NP programming models are classified as Pool of Threads (POTs) and Hyper Task Chaining (HTC). In this study, an integrated programming model that combines both was designed. This model is highly efficient and flexible, and it also solves the problems of sharing conflicts and packet ordering. We used this model as the programming model to realize DiffServ QoS for the WCDMA CN. The realization mechanism of the DiffServ model mainly consists of buffer management, packet scheduling and packet classification algorithms based on NPs. First, we proposed an adaptive buffer management algorithm called Packet Adaptive Fair Dropping (PAFD), which takes into consideration both fairness and throughput, and has smooth service curves. Then, an improved packet scheduling algorithm called Priority-based Weighted Fair Queuing (PWFQ) was introduced to ensure the fairness of packet scheduling and reduce the queuing time of data packets, while keeping delay and jitter within a small range. Thirdly, a multi-dimensional packet classification algorithm called Classification Based on Network Processors (CBNPs) was designed; it effectively reduces memory accesses and storage space, and has lower time and space complexity. Lastly, an integrated hardware and software system implementing the DiffServ model of QoS for the WCDMA CN was proposed and implemented on the NP IXP2400. According to the corresponding experimental results, the proposed system significantly enhanced QoS for the WCDMA CN. It substantially improves response-time consistency, reduces display distortion and improves sound-image synchronization, thus increasing network efficiency and saving network resources.
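As background to the scheduling component, a minimal sketch of classic Weighted Fair Queuing with virtual finish times; the priority extension that distinguishes PWFQ is not shown, and none of this is the dissertation's NP-based implementation:

```python
import heapq

class WFQScheduler:
    """Simplified Weighted Fair Queuing: packets from flows with larger weights
    receive a proportionally larger share of the link."""

    def __init__(self):
        self.virtual_time = 0.0
        self.last_finish = {}   # per-flow virtual finish time
        self.heap = []          # (finish_time, sequence, flow_id, packet_len)
        self.seq = 0

    def enqueue(self, flow_id, packet_len, weight):
        start = max(self.virtual_time, self.last_finish.get(flow_id, 0.0))
        finish = start + packet_len / weight
        self.last_finish[flow_id] = finish
        heapq.heappush(self.heap, (finish, self.seq, flow_id, packet_len))
        self.seq += 1

    def dequeue(self):
        # Serve the packet with the smallest virtual finish time.
        if not self.heap:
            return None
        finish, _, flow_id, packet_len = heapq.heappop(self.heap)
        self.virtual_time = finish   # simplified virtual-time update
        return flow_id, packet_len
```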
Abstract:
Flow cytometry analyzers have become trusted companions due to their ability to perform fast and accurate analyses of human blood. The aim of these analyses is to determine the possible existence of abnormalities in the blood that have been correlated with serious disease states, such as infectious mononucleosis, leukemia, and various cancers. Though these analyzers provide important feedback, it is always desirable to improve the accuracy of the results, as evidenced by the misclassifications reported by some users of these devices. It is advantageous to provide a pattern interpretation framework that offers better classification ability than is currently available. Toward this end, the purpose of this dissertation was to establish a feature extraction and pattern classification framework capable of providing improved accuracy for detecting specific hematological abnormalities in flow cytometric blood data. This involved extracting a unique and powerful set of shift-invariant statistical features from the multi-dimensional flow cytometry data and then using these features as inputs to a pattern classification engine composed of an artificial neural network (ANN). The contribution of this method consisted of developing a descriptor matrix that can be used to reliably assess whether a donor's blood pattern exhibits a clinically abnormal level of variant lymphocytes, which are blood cells potentially indicative of disorders such as leukemia and infectious mononucleosis. This study showed that the set of shift-and-rotation-invariant statistical features extracted from the eigensystem of the flow cytometric data pattern performs better than other commonly used features in this type of disease detection, exhibiting an accuracy of 80.7%, a sensitivity of 72.3%, and a specificity of 89.2%. This performance represents a major improvement for this type of hematological classifier, which has historically been plagued by poor performance, with accuracies as low as 60% in some cases. This research ultimately shows that an improved feature space was developed that can deliver improved performance for the detection of variant lymphocytes in human blood, thus providing significant utility in the realm of suspect-flagging algorithms for the detection of blood-related diseases.
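A hedged sketch of the general idea of eigensystem-based, shift-and-rotation-invariant features feeding an ANN; the dissertation's descriptor matrix and network architecture are not reproduced, and all data here are synthetic:

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

def eigen_features(events):
    """events: (n_cells, n_channels) flow cytometry measurements for one sample.
    Eigenvalues of the covariance matrix are invariant to shifts and rotations
    of the measurement axes."""
    cov = np.cov(events, rowvar=False)
    return np.sort(np.linalg.eigvalsh(cov))[::-1]

# Synthetic stand-in data: one feature vector per donor sample
X = np.array([eigen_features(np.random.randn(1000, 5)) for _ in range(40)])
y = np.random.randint(0, 2, size=40)   # 1 = abnormal variant-lymphocyte level
clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0).fit(X, y)
```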
Abstract:
Anthropogenic habitat alterations and water-management practices have imposed an artificial spatial scale onto the once contiguous freshwater marshes of the Florida Everglades. To gain insight into how these changes may affect biotic communities, we examined whether variation in the abundance and community structure of large fishes (SL > 8 cm) in Everglades marshes varied more at regional or intraregional scales, and whether this variation was related to hydroperiod, water depth, floating mat volume, and vegetation density. From October 1997 to October 2002, we used an airboat electrofisher to sample large fishes at sites within three regions of the Everglades. Each of these regions is subject to a unique water-management schedule. Dry-down events (water depth < 10 cm) occurred at several sites during spring in 1999, 2000, 2001, and 2002; the 2001 dry-down event was the most severe and widespread. Abundance of several fishes decreased significantly through time, and the number of days post-dry-down covaried significantly with abundance for several species. Processes operating at the regional scale appear to play important roles in regulating large fishes. The most pronounced patterns in abundance and community structure occurred at the regional scale, and the effect size for region was greater than the effect size for sites nested within region for the abundance of all species combined, all predators combined, and each of the seven most abundant species. Non-metric multi-dimensional scaling revealed distinct groupings of sites corresponding to the three regions. We also found significant variation in community structure through time that correlated with the number of days post-dry-down. Our results suggest that hydroperiod and water management at the regional scale influence large-fish communities of Everglades marshes.
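A minimal sketch of the ordination step under stated assumptions (synthetic site-by-species data, Bray-Curtis dissimilarities, scikit-learn's non-metric MDS rather than the authors' software):

```python
import numpy as np
from scipy.spatial.distance import pdist, squareform
from sklearn.manifold import MDS

# Hypothetical abundance matrix: 30 sites x 7 species
abundance = np.random.default_rng(0).poisson(3.0, size=(30, 7))
dissim = squareform(pdist(abundance, metric="braycurtis"))

# Non-metric multi-dimensional scaling on the precomputed dissimilarities
nmds = MDS(n_components=2, metric=False, dissimilarity="precomputed", random_state=0)
site_scores = nmds.fit_transform(dissim)   # 2-D site coordinates for plotting regional groupings
```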