20 results for (standard) interval arithmetic

in Helda - Digital Repository of the University of Helsinki


Relevance:

20.00%

Publisher:

Abstract:

Objectives. The sentence span task is a complex working memory span task used for estimating total working memory capacity for both processing (sentence comprehension) and storage (remembering a set of words). Several traditional models of working memory suggest that performance on these tasks relies on phonological short-term storage. However, long-term memory effects, as well as the effects of expertise and strategies, have challenged this view. This study uses a working memory task that aids the creation of retrieval structures in the form of stories, which have been shown to form integrated structures in long-term memory. The research question is whether sentence and story contexts boost memory performance in a complex working memory task. The hypothesis is that storage of the words in the task takes place in long-term memory. Evidence of this would be better recall for words as parts of sentences than for separate words and, particularly, a beneficial effect for words forming part of an organized story.

Methods. Twenty stories consisting of five sentences each were constructed, and the stimuli in all experimental conditions were based on these sentences and their sentence-final words, reordered and recombined for the other conditions. Participants read aloud sets of five sentences that either formed a story or did not. In one condition they had to report all the last words at the end of the set; in another, they memorized an additional separate word with each sentence. The sentences were presented on the screen one word at a time (500 ms). After the presentation of each sentence, the participant verified a statement about the sentence. After five sentences, the participant repeated back the words in their correct positions. Experiment 1 (n = 16) used immediate recall; Experiment 2 (n = 21) used both immediate recall and recall after a distraction interval (the operation span task). In Experiment 2 a distracting mental arithmetic task was presented instead of recall in half of the trials, and an individual word was added before each sentence in the two conditions in which participants were to memorize the sentence-final words. Participants also performed a listening span task (Experiment 1) or an operation span task (Experiment 2) to allow comparison of the estimated span with performance in the story task. Results were analysed using correlations, repeated-measures ANOVA, and a chi-square goodness-of-fit test on the distribution of errors.

Results and discussion. Both the relatedness of the sentences (the story condition) and the inclusion of the words into sentences helped memory. An interaction showed that the story condition had a greater effect on last words than on separate words. The beneficial effect of the story was present in all serial positions, and the effects remained in delayed recall. When the sentences formed stories, performance in the verification of statements about the sentences was better. This, as well as the differing distributions of errors across experimental conditions, suggests that different levels of representation are in use in the different conditions. In the story condition, these representations could take the form of an organized memory structure, a situation model. The other working memory tasks had only a few weak correlations with the story task, which could indicate that different processes are in use in the tasks. The results do not support short-term phonological storage but are instead compatible with the words being encoded into long-term memory during the task.

Relevance:

20.00%

Publisher:

Abstract:

The aim of the present study was to assess oral health and treatment needs among adult Iranians according to socio-demographic status, smoking, and oral hygiene, and to investigate the relationships between these determinants and oral health. Data for 4448 young adult (aged 18) and 8301 middle-aged (aged 35 to 44) Iranians were collected in 2002 as part of a national survey using the World Health Organization (WHO) criteria for sampling and clinical diagnoses, across 28 provinces by 33 calibrated examiners. Gender, age, place of residence, and level of education served as socio-demographic information, smoking as the behavioural risk indicator, and the modified plaque index (PI) as the biological risk indicator for oral hygiene. The number of teeth, decayed teeth (DT), filled teeth (FT), decayed, missing, and filled teeth (DMFT), the community periodontal index (CPI), and prosthodontic rehabilitation served as outcome variables of oral health. The mean DMFT was 4.3 (standard deviation (SD) = 3.7) in young adults and 11.0 (SD = 6.4) among middle-aged individuals. Among young adults the D-component (DT, 70%), and among middle-aged individuals the M-component (60%), dominated the DMFT index. Among young adults, visible plaque was found in nearly all subjects. A maximum (max) PI score was associated with a higher mean number of DT and greater periodontal treatment needs. A healthy periodontium was a rare condition, with 8% of young adults and 1% of middle-aged individuals having a max CPI = 0. The majority of the CPI findings among young adults consisted of calculus (48%) and deepened periodontal pockets (21%); the respective values for middle-aged individuals were 40% and 53%. Having a deep pocket (max CPI = 4) was more likely among young adults with a low level of education (odds ratio (OR) = 2.7, 95% confidence interval (CI) = 1.9–4.0) than among well-educated individuals. Among middle-aged individuals, having calculus or a periodontal pocket was more likely in men (OR = 1.8, 95% CI = 1.6–2.0) and in illiterate subjects (OR = 6.3, 95% CI = 5.1–7.8) than in their counterparts. Among young adults, having 28 teeth was more prevalent (p < 0.05) among men (72% vs. 68% for women), urban residents (71% vs. 67% for rural residents), and those with a high level of education (73% vs. 60% for those with a low level). Among middle-aged individuals, having a functional dentition was associated with younger age (OR = 2.0, 95% CI = 1.7−2.5) and a higher level of education (OR = 1.8, 95% CI = 1.6−2.1). Of middle-aged individuals, 2% of 35- to 39-year-olds and 5% of those aged 40 to 44 were edentulous. Among the dentate subjects (n = 7,925), prosthodontic rehabilitation was more prevalent (p < 0.001) among women, urban residents, and those with a high level of education than among their counterparts. Among those having 1 to 19 teeth, a removable denture was the most common type of prosthodontic rehabilitation. Middle-aged individuals lacking a functional dentition were more likely (OR = 6.0, 95% CI = 4.8−7.6) to have prosthodontic rehabilitation than were those having a functional dentition. In total, 81% of all subjects reported being non-smokers, and 32% of men and 5% of women were current smokers. Heavy smokers were the most likely to have deepened periodontal pockets (max CPI ≥ 3, OR = 2.9, 95% CI = 1.8−4.7) and to have fewer than 20 teeth (OR = 2.3, 95% CI = 1.5−3.6). The findings indicate impaired oral health status in adult Iranians, particularly those of low socio-economic status and educational level.
The high prevalence of dental plaque and calculus and considerable unmet treatment needs call for a preventive population strategy with special emphasis on the improvement of oral self-care and smoking cessation to tackle the underlying risk factors for oral diseases in the Iranian adult population.
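
The associations above are reported as odds ratios with 95% confidence intervals (e.g. OR = 2.7, 95% CI = 1.9-4.0 for deep pockets among the less educated). As a minimal sketch of how such figures are typically derived from a 2x2 cross-tabulation using the standard log-odds (Woolf) approximation - the counts below are hypothetical and not taken from the survey:

    import math

    def odds_ratio_ci(a, b, c, d, z=1.96):
        """Odds ratio and approximate 95% CI for a 2x2 table:
                       outcome+  outcome-
            exposed        a         b
            unexposed      c         d
        Uses the log-odds (Woolf) standard error."""
        or_ = (a * d) / (b * c)
        se_log_or = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
        lower = math.exp(math.log(or_) - z * se_log_or)
        upper = math.exp(math.log(or_) + z * se_log_or)
        return or_, (lower, upper)

    # Hypothetical counts: deep periodontal pocket (max CPI = 4) by education level
    print(odds_ratio_ci(a=120, b=880, c=45, d=955))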

Relevance:

20.00%

Publisher:

Abstract:

Objectives: To evaluate the applicability of visual feedback posturography (VFP) for quantification of postural control, and to characterize the horizontal angular vestibulo-ocular reflex (AVOR) by use of a novel motorized head impulse test (MHIT).

Methods: In VFP, subjects standing on a platform were instructed to move their center of gravity to symmetrically placed peripheral targets as fast and accurately as possible. The active postural control movements were measured in healthy subjects (n = 23) and in patients with vestibular schwannoma (VS) before surgery (n = 49), one month (n = 17), and three months (n = 36) after surgery. In MHIT we recorded head and eye position during motorized head impulses (mean velocity of 170°/s and acceleration of 1550°/s²) in healthy subjects (n = 22), in patients with VS before surgery (n = 38), and about four months afterwards (n = 27). The gain, asymmetry and latency in MHIT were calculated.

Results: The intraclass correlation coefficient for VFP parameters during repeated tests was significant (r = 0.78-0.96; p < 0.01), although two of the four VFP parameters improved slightly during five test sessions in controls. At least one VFP parameter was abnormal pre- and postoperatively in almost half of the patients, and these abnormal preoperative VFP results correlated significantly with abnormal postoperative results. The mean accuracy of postural control in patients was reduced pre- and postoperatively. A significant side difference with VFP was evident in 10% of patients. In the MHIT, the normal gain was close to unity, the asymmetry in gain was within 10%, and the latency (mean ± standard deviation) was 3.4 ± 6.3 milliseconds. Ipsilateral gain or asymmetry in gain was abnormal preoperatively in 71% of patients, and abnormal in every patient after surgery. Preoperative gain (mean ± 95% confidence interval) was significantly lowered to 0.83 ± 0.08 on the ipsilateral side compared with 0.98 ± 0.06 on the contralateral side. The ipsilateral postoperative mean gain of 0.53 ± 0.05 was significantly different from the preoperative gain.

Conclusion: The VFP is a repeatable, quantitative method for assessing active postural control within individual subjects. The mean postural control in patients with VS was disturbed before and after surgery, although not severely. A side difference in postural control in the VFP was rare. The horizontal AVOR results in healthy subjects and in patients with VS, measured with MHIT, were in agreement with published data obtained using other head impulse techniques. The MHIT is a non-invasive method that allows reliable clinical assessment of the horizontal AVOR.
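
The MHIT outcome measures are the gain, asymmetry and latency of the AVOR. The abstract does not give the exact formulas used, so the sketch below is an illustrative assumption: gain as the ratio of peak eye velocity to peak head velocity, and asymmetry as a Jongkees-style percentage difference between the gains of the two sides.

    import numpy as np

    def avor_gain(head_velocity, eye_velocity):
        """Gain as peak compensatory eye velocity over peak head velocity
        (a gain near 1.0 indicates a normally compensating reflex)."""
        return np.max(np.abs(eye_velocity)) / np.max(np.abs(head_velocity))

    def gain_asymmetry_percent(gain_ipsilateral, gain_contralateral):
        """Asymmetry between the two impulse directions, as a percentage."""
        return 100.0 * (gain_contralateral - gain_ipsilateral) / (gain_contralateral + gain_ipsilateral)

    # Hypothetical velocity traces for a single motorized head impulse (deg/s)
    t = np.linspace(0.0, 0.2, 200)
    head = 170.0 * np.sin(np.pi * t / 0.2)
    eye = -0.83 * head                      # ipsilesional gain of about 0.83
    print(avor_gain(head, eye))             # ~0.83
    print(gain_asymmetry_percent(0.83, 0.98))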

Relevance:

20.00%

Publisher:

Abstract:

The autonomic nervous system is an important modulator of ventricular repolarization and arrhythmia vulnerability. This study explored the effects of cardiovascular autonomic function tests on repolarization and its heterogeneity, with special reference to congenital arrhythmogenic disorders typically associated with stress-induced fatal ventricular arrhythmias. The first part explored the effects of standardized autonomic tests on QT intervals in a 12-lead electrocardiogram and in multichannel magnetocardiography in 10 healthy adults. The second part studied the effects of deep breathing, the Valsalva manoeuvre, mental stress, sustained handgrip and mild exercise on QT intervals in asymptomatic patients with the LQT1 subtype of the hereditary long QT syndrome (n = 9) and in patients with arrhythmogenic right ventricular dysplasia (ARVD, n = 9). Even strong sympathetic activation had no effect on spatial QT interval dispersion in healthy subjects, but deep respiratory efforts and the Valsalva manoeuvre influenced it in ways that were opposite in electrocardiographic and magnetocardiographic recordings. LQT1 patients showed blunted QT interval and sinus nodal responses to sympathetic challenge, as well as exaggerated QT prolongation during the recovery phases. LQT1 patients showed a QT interval recovery overshoot in 2.4 ± 1.7 tests compared with 0.8 ± 0.7 in healthy controls (P = 0.02). Valsalva strain prolonged the T-wave peak to T-wave end interval only in the LQT1 patients, a change considered to reflect the arrhythmogenic substrate in this syndrome. ARVD patients showed signs of abnormal repolarization in the right ventricle, modulated by abrupt sympathetic activation. An electrocardiographic marker reflecting interventricular dispersion of repolarization was introduced. It showed that LQT1 patients exhibit a repolarization gradient from the left ventricle towards the right ventricle that is significantly larger than in controls. In contrast, ARVD patients showed a repolarization gradient from the right ventricle towards the left. Valsalva strain amplified the repolarization gradient in LQT1 patients, whereas it transiently reversed it in patients with ARVD. In conclusion, intrathoracic volume and pressure changes influence regional electrocardiographic and magnetocardiographic QT interval measurements differently. In particular, the recovery phases of standard cardiovascular autonomic function tests and the Valsalva manoeuvre reveal the abnormal repolarization in asymptomatic LQT1 patients. Both LQT1 and ARVD patients have abnormal interventricular repolarization gradients, modulated by abrupt sympathetic activation. Autonomic testing, and in particular the Valsalva manoeuvre, is potentially useful in unmasking abnormal repolarization in these syndromes.
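
The repolarization measures discussed here (the QT interval, its interlead dispersion, and the T-wave peak to T-wave end interval) reduce to simple interval arithmetic on annotated ECG fiducial points. A minimal sketch under common textbook definitions; the abstract does not state which rate-correction formula was used, so Bazett's correction below is an assumption.

    import math

    def qtc_bazett(qt_ms, rr_ms):
        """Heart-rate corrected QT interval (Bazett): QTc = QT / sqrt(RR in seconds)."""
        return qt_ms / math.sqrt(rr_ms / 1000.0)

    def qt_dispersion(qt_by_lead_ms):
        """Spatial QT dispersion: longest minus shortest QT across the recorded leads."""
        return max(qt_by_lead_ms) - min(qt_by_lead_ms)

    def tpeak_tend(t_peak_ms, t_end_ms):
        """T-wave peak to T-wave end interval, an index of dispersion of repolarization."""
        return t_end_ms - t_peak_ms

    # Hypothetical measurements (all in milliseconds)
    print(qtc_bazett(qt_ms=420, rr_ms=900))
    print(qt_dispersion([400, 415, 430, 408, 395, 422]))
    print(tpeak_tend(t_peak_ms=590, t_end_ms=680))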

Relevance:

20.00%

Publisher:

Abstract:

"We report on a search for the standard-model Higgs boson in pp collisions at s=1.96 TeV using an integrated luminosity of 2.0 fb(-1). We look for production of the Higgs boson decaying to a pair of bottom quarks in association with a vector boson V (W or Z) decaying to quarks, resulting in a four-jet final state. Two of the jets are required to have secondary vertices consistent with B-hadron decays. We set the first 95% confidence level upper limit on the VH production cross section with V(-> qq/qq('))H(-> bb) decay for Higgs boson masses of 100-150 GeV/c(2) using data from run II at the Fermilab Tevatron. For m(H)=120 GeV/c(2), we exclude cross sections larger than 38 times the standard-model prediction."

Relevance:

20.00%

Publisher:

Abstract:

We combine searches by the CDF and D0 collaborations for a Higgs boson decaying to W+W-. The data correspond to a total integrated luminosity of 4.8 fb^-1 (CDF) and 5.4 fb^-1 (D0) of p-pbar collisions at sqrt(s) = 1.96 TeV at the Fermilab Tevatron collider. No excess is observed above the background expectation, and the resulting limits on Higgs boson production exclude a standard-model Higgs boson in the mass range 162-166 GeV at the 95% C.L.

Relevance:

20.00%

Publisher:

Abstract:

We present a search for standard model (SM) Higgs boson production using ppbar collision data at sqrt(s) = 1.96 TeV, collected with the CDF II detector and corresponding to an integrated luminosity of 4.8 fb^-1. We search for Higgs bosons produced in all processes with a significant production rate and decaying to two W bosons. We find no evidence for SM Higgs boson production and place upper limits at the 95% confidence level on the SM production cross section (sigma(H)) for values of the Higgs boson mass (m_H) in the range from 110 to 200 GeV. These limits are the most stringent for m_H > 130 GeV and are a factor of 1.29 above the predicted value of sigma(H) for m_H = 165 GeV.

Relevance:

20.00%

Publisher:

Abstract:

We present a search for standard model Higgs boson production in association with a W boson in proton-antiproton collisions at a center of mass energy of 1.96 TeV. The search employs data collected with the CDF II detector that correspond to an integrated luminosity of approximately 1.9 fb^-1. We select events consistent with a signature of a single charged lepton, missing transverse energy, and two jets. Jets corresponding to bottom quarks are identified with a secondary vertex tagging method, a jet probability tagging method, and a neural network filter. We use kinematic information in an artificial neural network to improve discrimination between signal and background compared to previous analyses. The observed number of events and the neural network output distributions are consistent with the standard model background expectations, and we set 95% confidence level upper limits on the production cross section times branching fraction ranging from 1.2 to 1.1 pb, or 7.5 to 102 times the standard model expectation, for Higgs boson masses from 110 to 150 GeV/c^2, respectively.
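
The kinematic-information-in-a-neural-network idea can be sketched with a small feed-forward classifier on toy data. The features, their distributions, and the network below are hypothetical placeholders, not the CDF network or its actual inputs; the point is only that the per-event network output, rather than a single cut, is what the limit is ultimately set on.

    import numpy as np
    from sklearn.model_selection import train_test_split
    from sklearn.neural_network import MLPClassifier

    rng = np.random.default_rng(0)
    n = 5000

    # Toy kinematic features per event: dijet mass, missing ET, lepton pT (GeV)
    signal = np.column_stack([rng.normal(120, 15, n),
                              rng.exponential(40, n) + 20,
                              rng.normal(40, 10, n)])
    background = np.column_stack([rng.exponential(60, n) + 40,
                                  rng.exponential(30, n) + 15,
                                  rng.normal(35, 12, n)])
    X = np.vstack([signal, background])
    y = np.concatenate([np.ones(n), np.zeros(n)])

    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
    clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=500, random_state=0)
    clf.fit(X_train, y_train)

    # The distribution of this output over data and simulation is what would be
    # compared when setting the cross-section limit
    nn_output = clf.predict_proba(X_test)[:, 1]
    print("test accuracy:", clf.score(X_test, y_test))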

Relevance:

20.00%

Publisher:

Abstract:

In a search for new phenomena in a signature suppressed in the standard model of elementary particles (SM), we compare the inclusive production of events containing a lepton, a photon, significant transverse momentum imbalance (MET), and a jet identified as containing a b-quark to SM predictions. The search uses data produced in proton-antiproton collisions at 1.96 TeV, corresponding to 1.9 fb^-1 of integrated luminosity taken with the CDF detector at the Fermilab Tevatron. We find 28 lepton+photon+MET+b events versus an expectation of 31.0 +4.1/-3.5 events. If we further require events to contain at least three jets and large total transverse energy, simulations predict that the largest SM source is top-quark pair production with an additional radiated photon, ttbar+photon. In the data we observe 16 ttbar+photon candidate events versus an expectation from SM sources of 11.2 +2.3/-2.1. Assuming the difference between the observed number and the predicted non-top-quark total is due to SM top-quark production, we estimate the ttbar+photon cross section to be 0.15 +- 0.08 pb.
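
The quoted cross section follows from attributing the excess of observed events over the predicted non-top background to top-quark pair production with a photon, i.e. sigma = (N_obs - N_bkg) / (acceptance x efficiency x integrated luminosity). The acceptance-times-efficiency value below is a hypothetical placeholder (the abstract does not give it), chosen only so the arithmetic lands near the quoted 0.15 pb.

    def cross_section_pb(n_obs, n_bkg, acceptance_times_eff, lumi_fb):
        """Counting-experiment cross section estimate:
        sigma = (N_obs - N_bkg) / (acceptance * efficiency * integrated luminosity)."""
        lumi_pb = lumi_fb * 1000.0          # 1 fb^-1 = 1000 pb^-1
        return (n_obs - n_bkg) / (acceptance_times_eff * lumi_pb)

    # Event counts and luminosity from the abstract; acceptance x efficiency is assumed
    print(cross_section_pb(n_obs=16, n_bkg=11.2, acceptance_times_eff=0.017, lumi_fb=1.9))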

Relevance:

20.00%

Publisher:

Abstract:

Layering is a widely used method for structuring data in CAD models. During the last few years national standardisation organisations, professional associations, user groups for particular CAD systems, individual companies etc. have issued numerous standards and guidelines for the naming and structuring of layers in building design. In order to increase the integration of CAD data in the industry as a whole, ISO recently decided to define an international standard for layer usage. The resulting standard proposal, ISO 13567, is a rather complex framework standard which strives to be more of a union than the least common denominator of the capabilities of existing guidelines.

A number of principles have been followed in the design of the proposal. The first is the separation of the conceptual organisation of information (semantics) from the way this information is coded (syntax). The second is orthogonality - the fact that many ways of classifying information are independent of each other and can be applied in combination. The third overriding principle is the reuse of existing national or international standards whenever appropriate. The fourth principle allows users to apply well-defined subsets of the overall superset of possible layer names.

This article describes the semantic organisation of the standard proposal as well as its default syntax. Important information categories deal with the party responsible for the information, the type of building element shown, and whether a layer contains the direct graphical description of a building part or additional information needed in an output drawing. Non-mandatory information categories facilitate the structuring of information in rebuilding projects, the use of layers for spatial grouping in large multi-storey projects, and the storing of multiple representations intended for different drawing scales in the same model.

Pilot testing of ISO 13567 is currently being carried out in a number of countries which have been involved in the definition of the standard. In the article, two implementations, carried out independently in Sweden and Finland, are described. The article concludes with a discussion of the benefits and possible drawbacks of the standard. Incremental development within the industry (where "best practice" can become "common practice" via a standard such as ISO 13567) is contrasted with the more idealistic scenario of building product models. The relationship between CAD layering, document management, product modelling and building element classification is also discussed.
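
The field structure described above (mandatory responsible-party, element and presentation fields, plus optional fields such as status, sector or storey, and phase) lends itself to a simple fixed-width naming scheme. The sketch below composes such a layer name; the field widths and the example codes are illustrative assumptions, not normative values from ISO 13567.

    def compose_layer_name(agent, element, presentation,
                           status="-", sector="----", phase="-"):
        """Compose an ISO 13567-style layer name from fixed-width fields.
        Mandatory: agent (responsible party), element, presentation.
        Optional fields default to '-' fillers. Field widths are assumptions."""
        fields = [
            agent.ljust(2, "-")[:2],         # responsible party, e.g. architect
            element.ljust(6, "-")[:6],       # building element classification code
            presentation.ljust(2, "-")[:2],  # graphics vs. text/annotation etc.
            status.ljust(1, "-")[:1],        # new / existing / to be demolished
            sector.ljust(4, "-")[:4],        # spatial grouping, e.g. storey
            phase.ljust(1, "-")[:1],         # project phase
        ]
        return "".join(fields)

    # Hypothetical codes: architect's wall graphics, new construction, storey 02
    print(compose_layer_name("A", "21", "G", status="N", sector="02"))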

Relevance:

20.00%

Publisher:

Abstract:

After Gödel's incompleteness theorems and the collapse of Hilbert's programme Gerhard Gentzen continued the quest for consistency proofs of Peano arithmetic. He considered a finitistic or constructive proof still possible and necessary for the foundations of mathematics. For a proof to be meaningful, the principles relied on should be considered more reliable than the doubtful elements of the theory concerned. He worked out a total of four proofs between 1934 and 1939. This thesis examines the consistency proofs for arithmetic by Gentzen from different angles. The consistency of Heyting arithmetic is shown both in a sequent calculus notation and in natural deduction. The former proof includes a cut elimination theorem for the calculus and a syntactical study of the purely arithmetical part of the system. The latter consistency proof in standard natural deduction has been an open problem since the publication of Gentzen's proofs. The solution to this problem for an intuitionistic calculus is based on a normalization proof by Howard. The proof is performed in the manner of Gentzen, by giving a reduction procedure for derivations of falsity. In contrast to Gentzen's proof, the procedure contains a vector assignment. The reduction reduces the first component of the vector and this component can be interpreted as an ordinal less than epsilon_0, thus ordering the derivations by complexity and proving termination of the process.
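
The termination argument alluded to above rests on assigning to each derivation of falsity an ordinal below epsilon_0 that strictly decreases with every reduction step. A brief sketch in standard notation (not Gentzen's or Howard's own formulation):

    % epsilon_0 is the least ordinal closed under exponentiation with base omega:
    \varepsilon_0 = \sup\{\omega,\ \omega^{\omega},\ \omega^{\omega^{\omega}},\ \dots\},
    \qquad \omega^{\varepsilon_0} = \varepsilon_0 .

    % Termination: let o(d) < \varepsilon_0 be the ordinal assigned to a derivation d
    % (here, the first component of the vector assignment). If every reduction step
    % d \rightsquigarrow d' satisfies o(d') < o(d), then any reduction sequence
    % d_0 \rightsquigarrow d_1 \rightsquigarrow \cdots is finite, since the ordinals
    % below \varepsilon_0 are well-ordered and admit no infinite strictly
    % descending sequence.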