733 results for Segmented HPGe
Abstract:
The present paper suggests articulating the general context of the workplace in information literacy research, and considers distinguishing between information literacy research in workplaces and in professions. Referring to the results of a phenomenographic enquiry into web professionals' information literacy as an example, it indicates that, in particular contexts and depending on the nature of the context, work-related information literacy is experienced beyond physical workspaces and at the professional level. This involves people interacting with each other and with information at a broader level than within a physically bounded workspace. In the example case discussed in the paper, virtuality is identified as the dominant feature of the profession that causes information literacy to be experienced at the professional level. It is anticipated that pursuing the direction proposed in the paper will result in a more segmented image of work-related information literacy.
Abstract:
A divide-and-correct algorithm is described for multiple-precision division in the negative base number system. In this algorithm an initial quotient estimate is obtained from suitable segmented operands; this is then corrected by simple rules to arrive at the true quotient.
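The negative-base number system the algorithm operates in can be illustrated with a short sketch converting integers to and from base -2 (negabinary). This shows only the number representation, not the paper's divide-and-correct division procedure itself; the function names are illustrative.

```python
def to_negabase(n, base=-2):
    """Return the digit list (most significant first) of n in a negative base."""
    if n == 0:
        return [0]
    digits = []
    while n != 0:
        n, r = divmod(n, base)
        if r < 0:          # force a non-negative digit in 0..|base|-1
            r -= base
            n += 1
        digits.append(r)
    return digits[::-1]

def from_negabase(digits, base=-2):
    """Evaluate a negative-base digit list back to an integer."""
    value = 0
    for d in digits:
        value = value * base + d
    return value
```

A distinctive property of negative bases is that every integer, positive or negative, gets a representation without a sign: for example 6 is 11010 in base -2 (16 - 8 - 2 = 6).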
Abstract:
The analysis of sequential data is required in many diverse areas such as telecommunications, stock market analysis, and bioinformatics. A basic problem related to the analysis of sequential data is the sequence segmentation problem. A sequence segmentation is a partition of the sequence into a number of non-overlapping segments that cover all data points, such that each segment is as homogeneous as possible. This problem can be solved optimally using a standard dynamic programming algorithm. In the first part of the thesis, we present a new approximation algorithm for the sequence segmentation problem. This algorithm has a smaller running time than the optimal dynamic programming algorithm, while having a bounded approximation ratio. The basic idea is to divide the input sequence into subsequences, solve the problem optimally in each subsequence, and then appropriately combine the solutions to the subproblems into one final solution. In the second part of the thesis, we study alternative segmentation models that are devised to better fit the data. More specifically, we focus on clustered segmentations and segmentations with rearrangements. While in the standard segmentation of a multidimensional sequence all dimensions share the same segment boundaries, in a clustered segmentation the multidimensional sequence is segmented in such a way that dimensions are allowed to form clusters. Each cluster of dimensions is then segmented separately. We formally define the problem of clustered segmentations and experimentally show that segmenting sequences using this segmentation model leads to solutions with smaller error for the same model cost. Segmentation with rearrangements is a novel variation of the segmentation problem: in addition to partitioning the sequence, we also seek to apply a limited amount of reordering, so that the overall representation error is minimized.
We formulate the problem of segmentation with rearrangements and show that it is NP-hard to solve, and even to approximate. We devise effective algorithms for the proposed problem, combining ideas from dynamic programming and outlier detection algorithms in sequences. In the final part of the thesis, we discuss the problem of aggregating results of segmentation algorithms on the same set of data points. In this case, we are interested in producing a partitioning of the data that agrees as much as possible with the input partitions. We show that this problem can be solved optimally in polynomial time using dynamic programming. Furthermore, we show that not all data points are candidates for segment boundaries in the optimal solution.
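The standard dynamic programming algorithm for optimal sequence segmentation can be sketched as follows, for the common piecewise-constant error model in which each segment is represented by its mean and the cost is the sum of squared deviations (the error model is an illustrative assumption; the thesis treats the problem more generally):

```python
def segment_cost(prefix, prefix_sq, i, j):
    """Sum of squared deviations from the mean for points i..j-1 (half-open)."""
    n = j - i
    s = prefix[j] - prefix[i]
    sq = prefix_sq[j] - prefix_sq[i]
    return sq - s * s / n

def optimal_segmentation(points, k):
    """O(n^2 k) dynamic program: minimise total within-segment squared error
    over all partitions of the sequence into k contiguous segments."""
    n = len(points)
    prefix = [0.0] * (n + 1)
    prefix_sq = [0.0] * (n + 1)
    for i, x in enumerate(points):
        prefix[i + 1] = prefix[i] + x
        prefix_sq[i + 1] = prefix_sq[i] + x * x
    INF = float("inf")
    # dp[m][j] = best cost of covering the first j points with m segments
    dp = [[INF] * (n + 1) for _ in range(k + 1)]
    back = [[0] * (n + 1) for _ in range(k + 1)]
    dp[0][0] = 0.0
    for m in range(1, k + 1):
        for j in range(m, n + 1):
            for i in range(m - 1, j):
                c = dp[m - 1][i] + segment_cost(prefix, prefix_sq, i, j)
                if c < dp[m][j]:
                    dp[m][j] = c
                    back[m][j] = i
    # recover the segment boundaries by walking the back-pointers
    bounds, j = [], n
    for m in range(k, 0, -1):
        i = back[m][j]
        bounds.append((i, j))
        j = i
    return dp[k][n], bounds[::-1]
```

The approximation algorithm described above would run this optimal routine on each subsequence of the divided input and then combine the per-subsequence solutions.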
Abstract:
Online content services can greatly benefit from personalisation features that enable delivery of content that is suited to each user's specific interests. This thesis presents a system that applies text analysis and user modeling techniques in an online news service for the purpose of personalisation and user interest analysis. The system creates a detailed thematic profile for each content item and observes the user's actions towards content items to learn the user's preferences. A handcrafted taxonomy of concepts, or ontology, is used in profile formation to extract relevant concepts from the text. User preference learning is automatic, and there is no need for explicit preference settings or ratings from the user. Learned user profiles are segmented into interest groups using clustering techniques, with the objective of providing a source of information for the service provider. Some theoretical background for the chosen techniques is presented, while the main focus is on finding practical solutions to some current information needs that are not optimally served by traditional techniques.
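The abstract does not name the specific clustering technique, so as an illustration only, a plain k-means over concept-weight vectors sketches how learned user profiles might be grouped into interest segments (all data and the function name are hypothetical):

```python
def kmeans(profiles, k, iters=20):
    """Plain k-means over concept-weight vectors (one list per user)."""
    centroids = [list(p) for p in profiles[:k]]   # deterministic init for the sketch
    assign = [0] * len(profiles)
    for _ in range(iters):
        # assignment step: nearest centroid by squared Euclidean distance
        for idx, p in enumerate(profiles):
            assign[idx] = min(
                range(k),
                key=lambda c: sum((a - b) ** 2 for a, b in zip(p, centroids[c])),
            )
        # update step: move each centroid to the mean of its members
        for c in range(k):
            members = [p for idx, p in enumerate(profiles) if assign[idx] == c]
            if members:
                centroids[c] = [sum(col) / len(members) for col in zip(*members)]
    return assign, centroids
```

Here each vector component would be the learned weight of one ontology concept for that user; users landing in the same cluster form one interest group.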
Abstract:
Background: A paradigm shift in educational policy to create problem solvers and critical thinkers produced the games concept approach (GCA) in Singapore's Revised Syllabus for Physical Education (1999). A pilot study (2001) conducted on 11 primary school student teachers (STs) using this approach identified time management and questioning as two of the major challenges faced by novice teachers. Purpose: To examine the GCA from three perspectives: structure (lesson form in terms of teacher-time and pupil-time); product (how STs used those time fractions); and process (the nature of their questioning: type, timing, and target). Participants and setting: Forty-nine STs from three different PETE cohorts (two-year diploma, four-year degree, two-year post-graduate diploma) volunteered to participate in the study, conducted during the penultimate week of their final practicum in public primary and secondary schools. Intervention: Based on the findings of the pilot study, PETE increased the emphasis on GCA content-specific knowledge and pedagogical procedures. To further support STs' learning to actualise the GCA, authentic micro-teaching experiences that were closely monitored by faculty were provided in nearby schools. Research design: This is a descriptive study of the time-management and questioning strategies implemented by STs on practicum. Each lesson was segmented into a number of sub-categories of teacher-time (organisation, demonstration and closure) and pupil-time (practice time and game time). Questions were categorised as knowledge, technical, tactical or affective. Data collection: Each ST was video-taped teaching a GCA lesson towards the end of their final practicum. The STs individually determined the timing of the data collection and the lesson to be observed. Data analysis: Each lesson was segmented into a number of sub-categories of both teacher- and pupil-time.
Duration recording using Noldus software (Observer 4.0) segmented the time management of different lesson components. Questioning was coded in terms of type, timing and target. Separate MANOVAs were used to measure the differences between programmes and levels (primary and secondary) in relation to time-management procedures and questioning strategies. Findings: No differences emerged between the programmes or levels in their time-management or questioning strategies. Using the GCA, STs generated more pupil time (53%) than teacher time (47%). STs at the primary level provided more technical practice, and those in secondary schools more small-sided game play. Most questions (58%) were asked during play or practice, but they were predominantly low-order, involving knowledge or recall (76%); only 6.7% were open-ended or divergent and capable of developing tactical awareness. Conclusions: Although STs are delivering more pupil time (practice and game) than teacher time, the lesson structure requires further fine-tuning to extend the practice task beyond technical drills. Many questions are being asked to generate knowledge about games, but they lack sufficient quality to enhance critical thinking and tactical awareness, as the GCA intends.
Abstract:
PURPOSE To study the utility of fractional calculus in modeling gradient-recalled echo MRI signal decay in the normal human brain. METHODS We solved analytically the extended time-fractional Bloch equations, resulting in five model parameters, namely, the amplitude, relaxation rate, order of the time-fractional derivative, frequency shift, and constant offset. Voxel-level temporal fitting of the MRI signal was performed using the classical monoexponential model, a previously developed anomalous relaxation model, and our extended time-fractional relaxation model. Nine brain regions segmented from multiple-echo gradient-recalled echo 7 Tesla MRI data acquired from five participants were then used to investigate the characteristics of the extended time-fractional model parameters. RESULTS We found that the extended time-fractional model is able to fit the experimental data with smaller mean squared error than the classical monoexponential relaxation model and the anomalous relaxation model, neither of which accounts for frequency shift. CONCLUSIONS We were able to fit multiple-echo-time MRI data with high accuracy using the developed model. Parameters of the model likely capture information on microstructural and susceptibility-induced changes in the human brain.
Abstract:
By using the algebraic locus of the coupler curve of a PRRP planar linkage, in this paper, a kinematic theory is developed for planar, radially foldable closed-loop linkages. This theory helps derive the previously invented building blocks, which consist of only two inter-connected angulated elements, for planar foldable structures. Furthermore, a special case of a circumferentially actuatable foldable linkage (which is different from the previously known cases) is derived from the theory. A quantitative description of some known and some new properties of planar foldable linkages, including the extent of foldability, shape-preservation of the interior polygons, multi-segmented assemblies and heterogeneous circumferential arrangements, is also presented. The design equations derived here make the conception of even complex planar radially foldable linkages systematic and straightforward. Representative examples are presented to illustrate the usage of the design equations and the construction of prototypes. The current limitations and some possible extensions of the theory are also noted. (c) 2007, Elsevier Ltd. All rights reserved.
Abstract:
EEG recordings are often contaminated with ocular artifacts such as eye blinks and eye movements. These artifacts may obscure underlying brain activity in the electroencephalogram (EEG) data and make the analysis of the data difficult. In this paper, we explore the use of an empirical mode decomposition (EMD) based filtering technique to correct eye blink and eye movement artifacts in single-channel EEG data. In this method, the single-channel EEG data containing an ocular artifact is segmented such that the artifact in each segment is considered as some type of slowly varying trend in the data, and the EMD is used to remove the trend. The filtering is done using partial reconstruction from components of the decomposition. The method is completely data dependent and hence adaptive and nonlinear. Experimental results are provided to check the applicability of the method on real EEG data, and the results are quantified using power spectral density (PSD) as a measure. The method has given fairly good results and does not make use of any prior knowledge of the artifacts or the EEG data used.
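The partial-reconstruction step described above can be sketched as follows. In practice the intrinsic mode functions (IMFs) would come from an EMD implementation (the paper does not name one; the PyEMD package is one assumed option), so the components here are synthetic stand-ins that sum to the contaminated signal, with the slow trend playing the role of the ocular artifact:

```python
import math

# synthetic stand-ins for EMD output: a fast IMF plus a slow artifact-like trend
n = 200
fast = [math.sin(2 * math.pi * 25 * t / n) for t in range(n)]   # brain-rhythm-like
trend = [0.5 * (t / n) ** 2 for t in range(n)]                  # slow ocular drift
signal = [f + tr for f, tr in zip(fast, trend)]
imfs = [fast, trend]   # in practice: the IMF list returned by an EMD routine

def partial_reconstruction(imfs, keep):
    """Rebuild the signal from a subset of IMF indices (artifact removal)."""
    return [sum(imfs[k][i] for k in keep) for i in range(len(imfs[0]))]

# drop the trend-like component to obtain the corrected EEG segment
cleaned = partial_reconstruction(imfs, keep=[0])
```

Keeping all components reproduces the original signal; dropping the slowly varying ones implements the trend removal the method relies on.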
Abstract:
ErbB3 binding protein Ebp1 has been shown to downregulate ErbB3 receptor-mediated signaling to inhibit cell proliferation. Rinderpest virus belongs to the family Paramyxoviridae and is characterized by the presence of a non-segmented negative-sense RNA genome. In this work, we show that rinderpest virus infection of Vero cells leads to the down-regulation of the host factor Ebp1, at both the mRNA and protein levels. Ebp1 protein has been shown to co-localize with viral inclusion bodies in infected cells, and it is packaged into virions, presumably through its interaction with the N protein or the N-RNA itself. Overexpression of Ebp1 inhibits viral transcription and multiplication in infected cells, suggesting that a mutual antagonism operates between host factor Ebp1 and the virus.
Abstract:
This work examines T.H. Marshall's concept of citizenship and Talcott Parsons's concept of the societal community. I am especially interested in whether Marshall's concept of citizenship or Parsons's concept of the societal community enables the development of an analytical framework that creates a basis for a relevant examination of how the mechanisms that include citizens in, or exclude them from, society are constituted. The focus is on societal heterogeneity, which readily introduces multicultural issues in the form of diversity-based conflicts in values, norms and identities. The review concentrates on religious orientation and on the examination of the backgrounds of ethnic groups. The research method is a thorough examination of, and commentary on, the texts of T.H. Marshall and Talcott Parsons, on the basis of which I build my own argumentation and interpretation. As research findings I propose that especially the late works of Talcott Parsons offer analytical tools for studying societal pluralism in a way that also provides a fruitful basis for the thinking of 21st-century researchers. Parsons's analytical frames of reference form relevant starting points for social analyses based on inclusion and exclusion. Parsons describes the societal community as a differentiated and segmented network in which different customs and operating models are accepted. Cultural understanding differentiates how and in which contexts these will be applied. Under the conditions of open systems, however, culture cannot operate as a connector of the variations of actors, nor as a common code that fades away conflicts. Parsons's thinking opens a view onto the multicultural world, which is a world society consisting of ethnic groups that are not internally monolithic but are instead in a state of constant cultural redefinition.
Individuals and groups are differentiated based on sex, age, different capacities, place of residence, belonging to different collectivities, etc. The late works of Talcott Parsons provide a realistic and effective theoretical framework for research on citizenship problems in multicultural conditions. Keywords: citizenship, societal community, society, community, religion, ethnic background, inclusion, exclusion, values and norms.
Abstract:
Hantaviruses, members of the genus Hantavirus in the family Bunyaviridae, are enveloped single-stranded RNA viruses with a tri-segmented genome of negative polarity. In humans, hantaviruses cause two diseases, hemorrhagic fever with renal syndrome (HFRS) and hantavirus pulmonary syndrome (HPS), which vary in severity depending on the causative agent. Each hantavirus is carried by a specific rodent host and is transmitted to humans through the excreta of infected rodents. The genome of hantaviruses encodes four structural proteins: the nucleocapsid protein (N), the glycoproteins (Gn and Gc), and the polymerase (L), as well as the nonstructural protein (NSs). This thesis deals with the functional characterization of the hantavirus N protein with regard to its structure. Structural studies of the N protein have progressed slowly and the crystal structure of the whole protein is still not available; therefore, biochemical assays coupled with bioinformatic modeling proved essential for studying N protein structure and functions. Presumably, during RNA encapsidation, the N protein first forms intermediate trimers and then oligomers. First, we investigated the role of the N-terminal domain in N protein oligomerization. The results suggested that the N-terminal region of the N protein forms a coiled-coil, in which two antiparallel alpha helices interact via their hydrophobic seams. Hydrophobic residues L4, I11, L18, L25 and V32 in the first helix and L44, V51, L58 and L65 in the second helix were crucial for stabilizing the structure. The results were consistent with the head-to-head, tail-to-tail model for hantavirus N protein trimerization. We demonstrated that an intact coiled-coil structure of the N terminus is crucial for the oligomerization capacity of the N protein. We also added new details to the head-to-head, tail-to-tail model of trimerization by suggesting that the initial step is based on interaction(s) between intact intra-molecular coiled-coils of the monomers.
We further analyzed the importance of charged aa residues located within the coiled-coil for N protein oligomerization. To predict the interacting surfaces of the monomers, we used an upgraded in silico model of the coiled-coil domain that was docked into a trimer. Next, the predicted target residues were mutated. The results obtained using the mammalian two-hybrid assay suggested that conserved charged aa residues within the coiled-coil make a substantial contribution to N protein oligomerization. This contribution probably involves the formation of interacting surfaces of the N monomers and also stabilization of the coiled-coil via intramolecular ionic bridging. We proposed that the tips of the coiled-coils are the first to come into direct contact and thus initiate tight packing of the three monomers into a compact structure. This was in agreement with previous results showing that an increase in ionic strength abolished the interaction between N protein molecules. We also showed that the residues having the strongest effect on N protein oligomerization are not scattered randomly throughout the coiled-coil 3D model structure, but form clusters. Next, we found evidence for the hantaviral N protein interaction with the cytoplasmic tail of the glycoprotein Gn. In order to study this interaction we used the GST pull-down assay in combination with a mutagenesis technique. The results demonstrated that intact, properly folded zinc fingers of the Gn protein cytoplasmic tail, as well as the middle domain of the N protein (which includes aa residues 80-248 and supposedly carries the RNA-binding domain), are essential for the interaction. Since hantaviruses do not have a matrix protein, which mediates the packaging of the viral RNA in other negative-stranded RNA viruses (NSRV), hantaviral RNPs should be involved in a direct interaction with the intraviral domains of the envelope-embedded glycoproteins.
By showing the N-Gn interaction, we provided evidence for one of the crucial steps in virus replication, at which RNPs are directed to the site of virus assembly. Finally, we started analysis of the N protein RNA-binding region, which is supposedly located in the middle domain of the N protein molecule. We developed a model for the initial step of RNA binding by the hantaviral N protein. We hypothesized that the hantaviral N protein possesses two secondary structure elements that initiate RNA encapsidation. The results suggest that amino acid residues 172-176 presumably act as a hook to catch vRNA and that the positively charged interaction surface (aa residues 144-160) enhances the initial N-RNA interaction. In conclusion, we elucidated new functions of the hantavirus N protein. Using in silico modeling we predicted the domain structure of the protein, and using experimental techniques we showed that each domain is responsible for executing certain function(s). We showed that the intact N-terminal coiled-coil domain is crucial for oligomerization and that charged residues located on its surface form an interaction surface for the N monomers. The middle domain is essential for interaction with the cytoplasmic tail of the Gn protein and for RNA binding.
Abstract:
This paper investigates to what extent the volatility of Finnish stock portfolios is transmitted through the "world volatility". We operationalize the volatility processes of Finnish leverage, industry, and size portfolio returns with asymmetric GARCH specifications following Glosten et al. (1993). We use daily return data from January 2, 1987 to December 30, 1998. We find that the world shock enters the domestic models significantly, and that its impact has increased over time. The same holds for the variance ratios and the correlations with the world. The larger the firm, the larger the world impact. The conditional variance is higher during recessions. The asymmetry parameter is, surprisingly, non-significant, and the leverage hypothesis cannot be verified. The return-generating process of the domestic portfolio returns usually does not include the world information set, indicating that the returns are generated by a segmented conditional asset pricing model.
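The asymmetric specification of Glosten et al. (1993), often called GJR-GARCH, adds a leverage term that is active only after negative shocks. A minimal sketch of the variance recursion follows; the parameter values are illustrative, not estimates from the paper:

```python
def gjr_garch_variance(returns, omega, alpha, gamma, beta):
    """Conditional variance recursion of Glosten et al. (1993):
    sigma2[t] = omega + (alpha + gamma * 1{r[t-1] < 0}) * r[t-1]**2
                + beta * sigma2[t-1]
    """
    # start from the unconditional variance implied by the parameters
    sigma2 = [omega / (1.0 - alpha - gamma / 2.0 - beta)]
    for t in range(1, len(returns)):
        shock = returns[t - 1]
        leverage = gamma if shock < 0 else 0.0
        sigma2.append(omega + (alpha + leverage) * shock ** 2 + beta * sigma2[-1])
    return sigma2

# a negative shock raises next-period variance more than an equal positive one
v_neg = gjr_garch_variance([-0.02, 0.0], 1e-6, 0.05, 0.10, 0.85)[1]
v_pos = gjr_garch_variance([0.02, 0.0], 1e-6, 0.05, 0.10, 0.85)[1]
```

A significant gamma is what the leverage hypothesis predicts; the paper's finding of a non-significant asymmetry parameter means this extra term did not help for these Finnish portfolios.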
Abstract:
This paper describes a method for the automated segmentation of speech that assumes the signal is continuously time-varying, rather than following the traditional short-time stationary model. This representation has been shown to give comparable, if not marginally better, results than other techniques for automated segmentation. A formulation of the 'Bach' (music semitonal) frequency-scale filter bank is proposed. A comparative study has been made of the performance of Mel, Bark and Bach scale filter banks under this model. The preliminary results show up to 80% matches within 20 ms of the manually segmented data, without any information about the content of the text and without any language dependence. The 'Bach' filters are seen to marginally outperform the other filters.
Abstract:
This correspondence describes a method for the automated segmentation of speech. The proposed method uses a specially designed filter bank, called the Bach filter bank, which makes use of music-related perception criteria. The speech signal is treated as a continuously time-varying signal, as opposed to the short-time stationary model. A comparative study has been made of the performance of Mel, Bark and Bach scale filter banks. The preliminary results show up to 80% matches within 20 ms of the manually segmented data, without any information about the content of the text and without any language dependence. The Bach filters are seen to marginally outperform the other filters.
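Since the Bach scale is described as a music semitonal frequency scale, its filter centre frequencies can be sketched as equal-tempered semitone steps, i.e. successive centres differing by a factor of 2^(1/12). The base frequency, band count, and function name below are illustrative assumptions, not the papers' actual filter-bank design:

```python
def semitone_centers(f0, n_filters):
    """Centre frequencies spaced one equal-tempered semitone apart:
    f_k = f0 * 2**(k / 12), so every 12th filter doubles the frequency."""
    return [f0 * 2.0 ** (k / 12.0) for k in range(n_filters)]

# two octaves of centre frequencies starting at 110 Hz
centers = semitone_centers(110.0, 25)
```

Unlike the Mel and Bark scales, which are derived from auditory perception experiments, this spacing is geometric throughout, which is what distinguishes the music-based design.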
Abstract:
The research analyzes product quality from a customer perspective in the case of the wood products industry. Of specific interest is to better understand how environmental quality is perceived from a customer perspective. The empirical material comprises four data-sets from Finland, Germany and the UK, collected during 1992-2004. The methods consist of a set of quantitative statistical analyses. The results indicate that perceived quality from a customer perspective can be represented by a multidimensional and hierarchical construct with tangible and intangible dimensions that is common to different markets and products. This applies in the case of wood products, but also more generally for at least some other construction materials. For wood products, tangible product quality has two main sub-dimensions: technical quality and appearance. For product intangibles, a few main quality dimensions seem to be detectable: quality of intangibles related to the physical product, such as environmental issues and product-related information; supplier-related characteristics; and service and sales personnel behavior. Environmental quality and information are often perceived as being inter-related. Technical performance and appearance are the most important considerations for customers in the case of wood products. Organizational customers in particular also clearly consider certain intangible quality dimensions to be important, such as service and supplier reliability. High technical quality may be considered a 'license to operate', but product appearance and intangible quality provide potential for differentiation to attract certain market segments. Intangible quality issues are those where Nordic suppliers underperform in comparison to their Central European competitors on the important German markets. Environmental quality may not have been used to its full extent to attract customers.
One possibility is to increase the availability of environment-related information, or to develop environment-related product characteristics that also provide some individual benefits. Information technology provides clear potential to facilitate information-based quality improvements, which was clearly recognized by the Finnish forest industry already in the early 1990s. The results indeed indicate that wood products markets are segmented with regard to quality demands.