884 results for Popularity.
Abstract:
Due to the huge popularity of portable terminals based on wireless LANs and the increasing demand for multimedia services on these terminals, earlier structures and protocols are insufficient to cover the requirements of emerging networks and communications. Most research in this field aims to find more efficient ways to optimize the quality of wireless LANs with respect to the requirements of multimedia services. Our work investigates the effects of modulation modes at the physical layer, retry limits at the MAC layer, and packet sizes at the application layer on the quality of media packet transmission. The interrelation among these parameters, from which a cross-layer idea can be extracted, is discussed as well. We show how these parameters from different layers jointly contribute to the performance of service delivery by the network. The results obtained could form a basis for suggesting independent optimization within each layer (an adaptive approach) or joint optimization of a set of parameters from different layers (a cross-layer approach). Our simulation model is implemented in the NS-2 simulator. Throughput and delay (latency) of packet transmission are the quantities of our assessment. © 2010 IEEE.
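As a schematic illustration of the assessment quantities named above (not the authors' actual NS-2 post-processing), throughput and mean delay can be computed from per-packet send/receive records; the trace tuples below are hypothetical.

```python
# Hypothetical per-packet records: (send_time_s, recv_time_s, size_bytes).
# In practice these would be parsed from an NS-2 trace file.
packets = [
    (0.00, 0.02, 1000),
    (0.01, 0.04, 1000),
    (0.02, 0.05, 500),
]

def throughput_bps(packets):
    """Delivered bits divided by the time span of the transfer."""
    total_bits = sum(size * 8 for _, _, size in packets)
    span = max(r for _, r, _ in packets) - min(s for s, _, _ in packets)
    return total_bits / span

def mean_delay(packets):
    """Average one-way latency over all delivered packets."""
    return sum(r - s for s, r, _ in packets) / len(packets)

print(throughput_bps(packets))  # ~400000 bits/s for the toy trace
print(mean_delay(packets))
```

Varying retry limit, modulation mode, or packet size in the simulation shifts both quantities, which is what makes a joint (cross-layer) view informative.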
Abstract:
X-ray computed tomography (CT) imaging constitutes one of the most widely used diagnostic tools in radiology today, with nearly 85 million CT examinations performed in the U.S. in 2011. CT imparts a relatively high radiation dose to the patient compared to other x-ray imaging modalities; as a result of this fact, coupled with its popularity, CT is currently the single largest source of medical radiation exposure to the U.S. population. For this reason, there is a critical need to optimize CT examinations such that the dose is minimized while the quality of the CT images is not degraded. This optimization can be difficult to achieve due to the relationship between dose and image quality: all else being equal, reducing the dose degrades image quality and can impact the diagnostic value of the CT examination.
A recent push from the medical and scientific community toward lower doses has spawned new dose reduction technologies such as automatic exposure control (i.e., tube current modulation) and iterative reconstruction algorithms. In theory, these technologies could allow scanning at reduced doses while maintaining image quality at an acceptable level. Therefore, there is a scientific need to establish the dose reduction potential of these new technologies in an objective and rigorous manner. Establishing these dose reduction potentials requires precise and clinically relevant metrics of CT image quality, as well as practical and efficient methodologies to measure such metrics on real CT systems. The currently established methodologies for assessing CT image quality are not appropriate for modern CT scanners that implement the aforementioned dose reduction technologies.
Thus, the purpose of this doctoral project was to develop, assess, and implement new phantoms, image quality metrics, analysis techniques, and modeling tools that are appropriate for image quality assessment of modern clinical CT systems. The project developed image quality assessment methods in the context of three distinct paradigms: (a) uniform phantoms, (b) textured phantoms, and (c) clinical images.
The work in this dissertation used the “task-based” definition of image quality. That is, image quality was broadly defined as the effectiveness with which an image can be used for its intended task. Under this definition, any assessment of image quality requires three components: (1) a well-defined imaging task (e.g., detection of subtle lesions), (2) an “observer” to perform the task (e.g., a radiologist or a detection algorithm), and (3) a way to measure the observer’s performance in completing the task at hand (e.g., detection sensitivity/specificity).
First, this task-based image quality paradigm was implemented using a novel multi-sized phantom platform (with uniform background) developed specifically to assess modern CT systems (Mercury Phantom, v3.0, Duke University). A comprehensive evaluation was performed on a state-of-the-art CT system (SOMATOM Definition Force, Siemens Healthcare) in terms of noise, resolution, and detectability as a function of patient size, dose, tube energy (i.e., kVp), automatic exposure control, and reconstruction algorithm (i.e., Filtered Back-Projection (FBP) vs. Advanced Modeled Iterative Reconstruction (ADMIRE)). A mathematical observer model (i.e., a computerized detection algorithm) was implemented and used as the basis of image quality comparisons. It was found that image quality increased with increasing dose and decreasing phantom size. The CT system exhibited nonlinear noise and resolution properties, especially at very low doses, large phantom sizes, and for low-contrast objects. Objective image quality metrics generally increased with increasing dose and ADMIRE strength, and with decreasing phantom size. The ADMIRE algorithm could offer comparable image quality at reduced doses or improved image quality at the same dose (an increase in detectability index of up to 163% depending on iterative strength). The use of automatic exposure control resulted in more consistent image quality with changing phantom size.
Based on those results, the dose reduction potential of ADMIRE was further assessed specifically for the task of detecting small (<=6 mm) low-contrast (<=20 HU) lesions. A new low-contrast detectability phantom (with uniform background) was designed and fabricated using a multi-material 3D printer. The phantom was imaged at multiple dose levels and images were reconstructed with FBP and ADMIRE. Human perception experiments were performed to measure the detection accuracy from FBP and ADMIRE images. It was found that ADMIRE had equivalent performance to FBP at 56% less dose.
Using the same image data as the previous study, a number of different mathematical observer models were implemented to assess which models would result in image quality metrics that best correlated with human detection performance. The models included naïve simple metrics of image quality such as contrast-to-noise ratio (CNR) and more sophisticated observer models such as the non-prewhitening matched filter observer model family and the channelized Hotelling observer model family. It was found that non-prewhitening matched filter observers and the channelized Hotelling observers both correlated strongly with human performance. Conversely, CNR was found to not correlate strongly with human performance, especially when comparing different reconstruction algorithms.
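As a rough sketch of two of the metrics compared above (not the dissertation's implementation), CNR and the simplest white-noise form of the non-prewhitening matched-filter d′ can be written as follows; real channelized Hotelling observers additionally model noise correlations and channel responses.

```python
import math

def cnr(signal_mean, bg_mean, bg_std):
    """Contrast-to-noise ratio: signal-background contrast over background noise."""
    return abs(signal_mean - bg_mean) / bg_std

def npw_dprime(signal_profile, noise_std):
    """Non-prewhitening matched filter under white noise.

    The template equals the expected signal, so
    d' = sqrt(sum of squared signal values) / noise standard deviation.
    """
    energy = sum(s * s for s in signal_profile)
    return math.sqrt(energy) / noise_std

print(cnr(120.0, 100.0, 10.0))       # 2.0
print(npw_dprime([3.0, 4.0], 1.0))   # 5.0
```

The abstract's finding follows naturally from this structure: CNR ignores signal shape and noise texture entirely, which is why it can fail to track human performance across reconstruction algorithms, while matched-filter observers do not.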
The uniform background phantoms used in the previous studies provided a good first-order approximation of image quality. However, due to their simplicity and due to the complexity of iterative reconstruction algorithms, it is possible that such phantoms are not fully adequate to assess the clinical impact of iterative algorithms, because patient images obviously do not have smooth uniform backgrounds. To test this hypothesis, two textured phantoms (classified as gross texture and fine texture) and a uniform phantom of similar size were built and imaged on a SOMATOM Flash scanner (Siemens Healthcare). Images were reconstructed using FBP and Sinogram Affirmed Iterative Reconstruction (SAFIRE). Using an image subtraction technique, quantum noise was measured in all images of each phantom. It was found that in FBP, the noise was independent of the background (textured vs. uniform). However, for SAFIRE, noise increased by up to 44% in the textured phantoms compared to the uniform phantom. As a result, the noise reduction from SAFIRE was found to be up to 66% in the uniform phantom but as low as 29% in the textured phantoms. Based on this result, it is clear that further investigation was needed to understand the impact that background texture has on image quality when iterative reconstruction algorithms are used.
To further investigate this phenomenon with more realistic textures, two anthropomorphic textured phantoms were designed to mimic lung vasculature and fatty soft tissue texture. The phantoms (along with a corresponding uniform phantom) were fabricated with a multi-material 3D printer and imaged on the SOMATOM Flash scanner. Scans were repeated a total of 50 times in order to obtain ensemble statistics of the noise. A novel method of estimating the noise power spectrum (NPS) from irregularly shaped ROIs was developed. It was found that SAFIRE images had highly locally non-stationary noise patterns, with pixels near edges having higher noise than pixels in more uniform regions. Compared to FBP, SAFIRE images had 60% less noise on average in uniform regions; for edge pixels, noise was between 20% higher and 40% lower. The noise texture (i.e., NPS) was also highly dependent on the background texture for SAFIRE. Therefore, it was concluded that quantum noise properties in the uniform phantoms are not representative of those in patients for iterative reconstruction algorithms, and texture should be considered when assessing image quality of iterative algorithms.
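The irregular-ROI NPS method developed in the dissertation is more involved, but the ensemble NPS estimate it builds on can be sketched in one dimension (a textbook construction, not the author's code): subtract the mean from each noise realization, take the DFT, and average the squared magnitudes over realizations.

```python
import cmath

def dft(x):
    """Naive discrete Fourier transform (fine for short illustrative profiles)."""
    n = len(x)
    return [sum(x[k] * cmath.exp(-2j * cmath.pi * f * k / n) for k in range(n))
            for f in range(n)]

def nps_1d(realizations, pixel_spacing=1.0):
    """Ensemble-averaged 1-D noise power spectrum.

    Each realization is a noise profile of equal length; the mean is removed
    before the DFT, and |DFT|^2 is averaged over realizations, normalized by
    profile length and pixel spacing.
    """
    n = len(realizations[0])
    spectra = []
    for r in realizations:
        mean = sum(r) / n
        centered = [v - mean for v in r]
        spectra.append([abs(c) ** 2 for c in dft(centered)])
    return [pixel_spacing / n * sum(s[f] for s in spectra) / len(spectra)
            for f in range(n)]

# A pure alternating profile puts all its noise power at the Nyquist bin.
print(nps_1d([[1.0, -1.0, 1.0, -1.0]]))  # peak at index 2, zero elsewhere
```

The 2-D case on real scans replaces the naive DFT with an FFT over ROIs and averages over the 50 repeated acquisitions described above.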
To move beyond just assessing noise properties in textured phantoms toward assessing detectability, a series of new phantoms was designed specifically to measure low-contrast detectability in the presence of background texture. The textures used were optimized to match the texture in the liver regions of actual patient CT images using a genetic algorithm. The so-called “Clustered Lumpy Background” texture synthesis framework was used to generate the modeled texture. Three textured phantoms and a corresponding uniform phantom were fabricated with a multi-material 3D printer and imaged on the SOMATOM Flash scanner. Images were reconstructed with FBP and SAFIRE and analyzed using a multi-slice channelized Hotelling observer to measure detectability and the dose reduction potential of SAFIRE based on the uniform and textured phantoms. It was found that at the same dose, the improvement in detectability from SAFIRE (compared to FBP) was higher when measured in a uniform phantom than in the textured phantoms.
The final trajectory of this project aimed at developing methods to mathematically model lesions, as a means to help assess image quality directly from patient images. The mathematical modeling framework is first presented. The models describe a lesion’s morphology in terms of size, shape, contrast, and edge profile as an analytical equation. The models can be voxelized and inserted into patient images to create so-called “hybrid” images. These hybrid images can then be used to assess detectability or estimability with the advantage that the ground truth of the lesion morphology and location is known exactly. Based on this framework, a series of liver lesions, lung nodules, and kidney stones were modeled based on images of real lesions. The lesion models were virtually inserted into patient images to create a database of hybrid images to go along with the original database of real lesion images. ROI images from each database were assessed by radiologists in a blinded fashion to determine the realism of the hybrid images. It was found that the radiologists could not readily distinguish between real and virtual lesion images (area under the ROC curve was 0.55). This study provided evidence that the proposed mathematical lesion modeling framework could produce reasonably realistic lesion images.
Based on that result, two studies were conducted which demonstrated the utility of the lesion models. The first study used the modeling framework as a measurement tool to determine how dose and reconstruction algorithm affected the quantitative analysis of liver lesions, lung nodules, and renal stones in terms of their size, shape, attenuation, edge profile, and texture features. The same database of real lesion images used in the previous study was used for this study. That database contained images of the same patient at 2 dose levels (50% and 100%) along with 3 reconstruction algorithms from a GE 750HD CT system (GE Healthcare). The algorithms in question were FBP, Adaptive Statistical Iterative Reconstruction (ASiR), and Model-Based Iterative Reconstruction (MBIR). A total of 23 quantitative features were extracted from the lesions under each condition. It was found that both dose and reconstruction algorithm had a statistically significant effect on the feature measurements. In particular, radiation dose affected five, three, and four of the 23 features (related to lesion size, conspicuity, and pixel-value distribution) for liver lesions, lung nodules, and renal stones, respectively. MBIR significantly affected 9, 11, and 15 of the 23 features (including size, attenuation, and texture features) for liver lesions, lung nodules, and renal stones, respectively. Lesion texture was not significantly affected by radiation dose.
The second study demonstrating the utility of the lesion modeling framework focused on assessing detectability of very low-contrast liver lesions in abdominal imaging. Specifically, detectability was assessed as a function of dose and reconstruction algorithm. As part of a parallel clinical trial, images from 21 patients were collected at 6 dose levels per patient on a SOMATOM Flash scanner. Subtle liver lesion models (contrast = -15 HU) were inserted into the raw projection data from the patient scans. The projections were then reconstructed with FBP and SAFIRE (strength 5). Lesion-less images were also reconstructed. Noise, contrast, CNR, and the detectability index of an observer model (non-prewhitening matched filter) were assessed. It was found that SAFIRE reduced noise by 52%, reduced contrast by 12%, increased CNR by 87%, and increased detectability index by 65% compared to FBP. Further, a 2AFC human perception experiment was performed to assess the dose reduction potential of SAFIRE, which was found to be 22% compared to the standard-of-care dose.
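The 2AFC experiment and the observer-model detectability index are connected by a standard signal-detection relation: for a Gaussian-noise-limited observer, the expected proportion correct in a two-alternative forced-choice task is PC = Φ(d′/√2). A minimal sketch of that mapping (a textbook formula, not the dissertation's analysis code):

```python
import math

def phi(x):
    """Standard normal cumulative distribution function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def two_afc_pc(dprime):
    """Expected 2AFC proportion correct: PC = Phi(d' / sqrt(2))."""
    return phi(dprime / math.sqrt(2.0))

print(two_afc_pc(0.0))  # 0.5 — chance performance when the lesion is undetectable
print(two_afc_pc(2.0))  # ~0.92
```

This is why a 65% gain in detectability index translates into a measurable, but smaller, gain in human 2AFC accuracy.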
In conclusion, this dissertation provides to the scientific community a series of new methodologies, phantoms, analysis techniques, and modeling tools that can be used to rigorously assess image quality from modern CT systems. Specifically, methods to properly evaluate iterative reconstruction have been developed and are expected to aid in the safe clinical implementation of dose reduction technologies.
Abstract:
Few symbols of 1950s-1960s America remain as central to our contemporary conception of Cold War culture as the iconic ranch-style suburban home. While the house took center stage in the Nixon/Khrushchev kitchen debates as a symbol of modern efficiency and capitalist values, its popularity depended largely upon its obvious appropriation of vernacular architecture from the 19th century, those California haciendas and Texas dogtrots that dotted the American west. Contractors like William Levitt modernized the historical common houses, hermetically sealing their porous construction, all while using the ranch-style roots of the dwelling to galvanize a myth of an indigenous American culture. At a moment of intense occupational bureaucracy, political uncertainty and atomized social life, the rancher gave a self-identifying white consumer base reason to believe they could master their own plot in the expansive frontier. Only one example of America’s mid-century love affair with commodified vernacular forms, the ranch-style home represents a broad effort on the part of corporate and governmental interest groups to transform the vernacular into a style that expresses a distinctly homogenous vision of American culture. “Other than a Citizen” begins with an anatomy of that transformation, and then turns to the work of four poets who sought to reclaim the vernacular from that process of standardization and use it to countermand the containment-era strategies of Cold War America.
In four chapters, I trace references to common speech and verbal expressivity in the poetry and poetic theory of Charles Olson, Robert Duncan, LeRoi Jones/Amiri Baraka and Gwendolyn Brooks, against the historical backdrop of the Free-Speech Movement and the rise of mass-culture. When poets frame nonliterary speech within the literary page, they encounter the inability of writing to capture the vital ephemerality of verbal expression. Rather than treat this limitation as an impediment, the writers in my study use the poem to dramatize the fugitivity of speech, emphasizing it as a disruptive counterpoint to the technologies of capture. Where critics such as Houston Baker interpret the vernacular strictly in terms of resistance, I take a cue from the poets and argue that the vernacular, rooted etymologically at the intersection of domestic security and enslaved margin, represents a gestalt form, capable at once of establishing centralized power and sparking minor protest. My argument also expands upon Michael North’s exploration of the influence of minstrelsy and regionalism on the development of modernist literary technique in The Dialect of Modernism. As he focuses on writers from the early 20th century, I account for the next generation, whose America was not a culturally inferior collection of immigrants but an imperial power, replete with economic, political and artistic dominance. Instead of settling for an essentially American idiom, the poets in my study saw in the vernacular not phonetic misspellings, slang terminology and fragmented syntax, but the potential to provoke and thereby frame a more ethical mode of social life, straining against the regimentation of citizenship.
My attention to the vernacular argues for an alignment among writers who have been segregated by the assumption that race and aesthetics are mutually exclusive categories. In reading these writers alongside one another, “Other than a Citizen” shows how the avant-garde concepts of projective poetics and composition by field develop out of an interest in black expressivity. Conversely, I trace black radicalism and its emphasis on sociality back to the communalism practiced at the experimental arts college in Black Mountain, North Carolina, where Olson and Duncan taught. In pressing for this connection, my work reveals the racial politics embedded within the speech-based aesthetics of the postwar era, while foregrounding the aesthetic dimension of militant protest.
Not unlike today, the popular rhetoric of the Cold War insists that to be a citizen involves defending one’s status as a rightful member of an exclusionary nation. To be other than a citizen, as the poets in my study make clear, begins with eschewing the false certainty that accompanies categorical nominalization. In promoting a model of mutually dependent participation, these poets lay the groundwork for an alternative model of civic belonging, where volition and reciprocity replace compliance and self-sufficiency. In reading their lines, we become all the more aware of the cracks that run the length of our load-bearing walls.
Abstract:
This dissertation examines the publication history of a single work: John Calvin’s 1552 Quatre sermons de M. Jehan Calvin traictans des matières fort utiles pour nostre temps, avec briefve exposition du Pseaume lxxxvii. Overlooked for both its contribution to Calvin’s wider corpus and its surprising popularity in English translation, successive editions of Quatre sermons display how Calvin’s argument against the behavior of so-called “Nicodemites” was adapted to various purposes unrelated to refuting religious dissimulation. The present study contributes to research in Calvin’s anti-Nicodemism by highlighting the fruitfulness of focusing on a discrete work and its reception. Borrowing a term (“Newter”) from John Field’s 1579 translation of Quatre sermons, this study’s title adumbrates its argument. English translators capitalized on the intrinsic malleability of a nameless and faceless opponent, the Nicodemite, and the adaptability of Quatre sermons’ genre as a collection of sermons to reshape—or, if you will, disfigure—both Calvin’s original foes and his case against them to advance various new agendas. Yet they were not the first to use the reformer’s sermons this way. They could have learned this from Calvin himself.
My examination of Quatre sermons opens by setting the work in the context of Calvin’s other writings and his political situation (Introduction, chapters one and two). Calvin’s unrelenting literary assault on French Nicodemism over three decades has long been recognized for its consistency and negativity. Yet scholars have tended to neglect how Calvin’s polemic against religious dissimulation could exhibit significant flexibility according to the needs of his context. Whereas Calvin’s preface promises simply to revisit his previous argument against participation in the Mass, his approach to Nicodemism in Quatre sermons seems adapted to accomplish goals beyond decrying false worship, offering a carefully-crafted apology for Calvin’s pastoral authority directed at his political situation. Repeatedly emphasizing God’s purpose to bless his children through the ministry of a rightly-ordered church, Quatre sermons marks a shift in Calvin’s anti-Nicodemite rhetoric away from purely negative critique, stressing instead God’s provision of spiritual nurture via political exile. Read in light of Calvin’s 1552 context, two audiences emerge: sermons ostensibly targeting believers in France who hid their faith also appear especially designed to silence Calvin’s foes in Geneva.
The remainder of the study examines the reception of Quatre sermons in the rapidly shifting religious and social contexts of Marian and Elizabethan England, where it appeared in more unique editions than any of Calvin’s writings besides the Institutio and the reformer’s 1542/45 Genevan Catechism. Calvin’s anti-Nicodemism has not been examined for its distinct contribution to the overall English reception of his thought. Five English versions of Quatre sermons appeared between 1553 and 1584—four of these under a Protestant queen, a situation quite different from the French context Calvin addressed. After situating Calvin’s position within the currents of Tudor Protestant anti-Nicodemism (chapter three), I place each of the five translations in its particular context, investigating prefaces, appendices, marginalia, and translation methods to discover how and why individuals used Quatre sermons (chapters four to six). Like Calvin in 1552, those who brought Quatre sermons to English readers were not primarily concerned with Nicodemism. Rather, the malleability of Calvin’s Nicodemite as polemical opponent and the flexibility of Quatre sermons as a sequence of discrete, interrelated parts made it popular with those eager to press Calvin into the service of a variety of diverse goals he could not have imagined, including turning his anti-Nicodemism against fellow members of the English church.
Abstract:
BACKGROUND: In light of evidence showing reduced criminal recidivism and cost savings, adult drug treatment courts have grown in popularity. However, the potential spillover benefits to family members are understudied. OBJECTIVES: To examine: (1) the overlap between parents who were convicted of a substance-related offense and their children's involvement with child protective services (CPS); and (2) whether parental participation in an adult drug treatment court program reduces children's risk for CPS involvement. METHODS: Administrative data from North Carolina courts, birth records, and social services were linked at the child level. First, children of parents convicted of a substance-related offense were matched to (a) children of parents convicted of a nonsubstance-related offense and (b) those not convicted of any offense. Second, we compared children of parents who completed a DTC program with children of parents who were referred but did not enroll, who enrolled for <90 days but did not complete, and who enrolled for 90+ days but did not complete. Multivariate logistic regression was used to model group differences in the odds of being reported to CPS in the 1 to 3 years following parental criminal conviction or, alternatively, being referred to a DTC program. RESULTS: Children of parents convicted of a substance-related offense were at greater risk of CPS involvement than children whose parents were not convicted of any charge, but DTC participation did not mitigate this risk. Conclusion/Importance: The role of specialty courts as a strategy for reducing children's risk of maltreatment should be further explored.
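As a minimal illustration of the kind of association examined here (the study itself used multivariate logistic regression on linked administrative data), an unadjusted odds ratio can be computed from a 2×2 table; the counts below are hypothetical, not the study's data.

```python
def odds_ratio(exposed_cases, exposed_noncases, unexposed_cases, unexposed_noncases):
    """Unadjusted odds ratio from a 2x2 table: (a*d) / (b*c)."""
    return (exposed_cases * unexposed_noncases) / (exposed_noncases * unexposed_cases)

# Hypothetical counts: CPS report within 3 years (yes/no), split by whether the
# parent had a substance-related conviction.
estimate = odds_ratio(120, 880, 60, 940)
print(round(estimate, 2))  # 2.14
```

Logistic regression generalizes this calculation by adjusting the same odds comparison for covariates such as child age, birth-record characteristics, and prior court history.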
Abstract:
This research is an essay in materialist and radical feminist critical theory. Its principal objective is to denounce the current structure of housing law. Starting from a conceptual framework grounded in materialist and radical feminism, it seeks to bring out the point of view of the class of women within housing. Housing law is used here in a broad sense, since it refers to housing as both a legal and a sociological phenomenon. Within the legal discipline, it refers to all legislation currently in force in Québec concerning life at home. Our study focuses on two modes of occupancy: the right of ownership and the rental system. The right to housing is recognized internationally in human rights instruments as the “right to adequate housing.” In Canada and Québec, it is not explicitly recognized, despite commitments made on the international stage. A statistical portrait, based on the criterion of sex, shows that gaps exist between men and women in the implementation of housing law. Women have more difficulty accessing housing; they perform the majority of domestic, service, and care work in the home, and they are the principal victims of violence committed at home. Within the housing system, women’s experience can be understood as an appropriation, at once private and collective, by the class of men, as theorized by Colette Guillaumin, centered on the sexual division of labor and sexed violence. Housing law, in its current form, rests on the appropriation of women’s labor power and of their bodies.
These two criteria make it possible to construct a materialist and radical feminist analytical grid for analyzing the structure of housing law as conceived in civil law. This feminist analysis also makes it possible to situate state law as a patriarchal practice, one that helps maintain the housing system, a system comparable to a hegemonic system in the sense developed by Gramsci. This study reflects on housing law within the neoliberal political climate. Neoliberalism is treated as an ideology that imposes market rationality on all state policies. Using a method described as externally metatheoretical and radically reflexive, since it imports conceptual tools foreign to the discipline of modern law, we radically rethink the construction of civil law and of the institutions that frame housing law. Data collection is carried out through documentary research. Four institutions of civil law are examined in detail: the subject of law; the private/public dichotomy; the mediation of housing law through immovable property, via the contractual relationship and the right of ownership; and, finally, notaries. The feminist analysis of the subject of law highlights a paradox. On the one hand, the presumed universality of this subject makes it possible to posit equality and liberty for all legal persons. Yet rather than being sexually neutral, as positive law claims, we show how this subject is invariably a member of the class of men. On the other hand, we analyze how law recognizes the sex of its subjects, and above all how this sexuality is constructed on naturalist ideology. This model of the masculine subject is fundamental to the construction of housing law. The feminist study of the private/public dichotomy brings out its situated character.
Indeed, if by essence no domain or issue is in itself private or public, the process of qualification is itself an act of power. We will see how civil law creates zones of private law, understood as zones of non-law for women. The qualification of “private” also devalues the work accomplished by this sex class. Housing law is nevertheless centered on the contractual relationship and on the right of ownership. It is therefore important to examine the nature of the consent given by women, as a social group, in contracts of sale and lease. These contracts do not take women’s experience into account in their formation. The categories attached to them, such as seller or tenant, represent the point of view of the class of men. Although the popularity of co-ownership among the class of women seems to carry a wind of change, we analyze how the dominant discourse surrounding it instrumentalizes certain feminist demands while leaving in the shadows the question of domestic work and sexed violence. Finally, we turn to notaries, rethinking them as organic intellectuals, as conceived by Gramsci, for the class of men. This intellectual function brings to light how each real estate transaction favors the reproduction of patriarchal interests, thereby calling into question the nature of the notariat’s duties of advice and impartiality. In light of this analysis, the Civil Code of Québec is characterized, from a materialist and radical feminist perspective, as a system that institutionalizes the appropriation of women through housing law.
This research opens up avenues of reflection for potential renovations of the legal practices surrounding housing law, notably notarial practice, oriented toward feminist goals of social justice.
Abstract:
The purpose of this thesis is to evaluate and refute Yvonne Griggs’ claims that the films “House of Strangers” (1949) and “Broken Lance” (1954) are what Griggs deems “genre-based adaptations” of William Shakespeare’s “King Lear.” I argue that the films, although they have some essential elements of “King Lear,” lack intentionality and reception, pivotal components in determining viability as a Shakespearean film adaptation. Using Griggs’ book as my critical background, I will show that these films are better classified under their respective genre categories, film noir and Western, not as “King Lear” genre adaptations. I will also suggest criteria for determining the level of canonicity of a “King Lear” film adaptation. The popularity of a film does not determine validity, and a film does not need purported Shakespearean provenance to validate its ratings. Some films, like these, merely reference or pay homage to Shakespeare through use of essential elements of “King Lear”; here, I deem such affinities to be more unintentional than intentional.
Abstract:
There has been a significant increase in the incidence of musculoskeletal disorders (MSD), and the associated costs are predicted to increase as the popularity of computer use grows at home, school, and work. Risk factors have been identified in the adult population, but little is known about the risk factors for children and youth. Research has demonstrated that they are not immune to this risk and that they are self-reporting the same pain as adults. The purpose of the study was to examine children’s postures while working at computer workstations under two conditions: one at an ergonomically adjusted children’s workstation, the second at an average adult workstation. A Polhemus Fastrak™ system was used to record the children’s postures, and joint and segment angles were quantified. Results of the study showed that children reported more discomfort and effort at the adult workstation. Segment and joint angles showed significant differences through the upper limb at the adult workstation. Of particular significance was the strategy of shoulder abduction and flexion that the children used in order to place their hand on the mouse. Ulnar deviation was also greater at the adult workstation, as was neck extension. All of these factors have been identified in the literature as increasing the risk for injury. A comparison of the children’s posture while playing at the children’s workstation versus the adult workstation showed that the postural angles assumed by the children at the adult workstation exceeded the Occupational Safety and Health Administration (OSHA) recommendations. Further investigation is needed to increase our knowledge of MSD in children, as their potential for long-term damage has yet to be determined.
Resumo:
Abstract Complexity science and its methodological applications have increased in popularity in social science during the last two decades. One key concept within complexity science is that of self-organization. Self-organization refers to the emergence of stable patterns through autonomous and self-reinforcing dynamics at the micro-level. In spite of its potential relevance for the study of social dynamics, the articulation and use of the concept of self-organization have been kept within the boundaries of complexity science, and links to and from mainstream social science are scarce. These links can be difficult to establish, even for researchers working in social complexity with a background in social science, because of the theoretical and conceptual diversity and fragmentation in traditional social science. This article is meant to serve as a first step in the process of overcoming this lack of cross-fertilization between complexity and mainstream social science. A systematic review of the concept of self-organization and a critical discussion of similar notions in mainstream social science are presented, in an effort to help practitioners within subareas of complexity science to identify literature from traditional social science that could potentially inform their research.
Resumo:
Emerging technologies have opened a new dimension of self – the ‘technoself’ – driven by socio-technical innovations, and have taken an important step forward in pervasive learning. Technology Enhanced Learning (TEL) research has increasingly focused on emergent technologies such as Augmented Reality (AR) for augmented learning, mobile learning, and game-based learning in order to improve the self-motivation and self-engagement of learners in enriched multimodal learning environments. This research takes advantage of technological innovations in hardware and software across different platforms and devices, including tablets, phablets and even game consoles, and their increasing popularity for pervasive learning, together with the significant development of personalization processes that place the student at the center of the learning process. In particular, augmented reality (AR) research has matured to a level that facilitates augmented learning, which is defined as an on-demand learning technique where the learning environment adapts to the needs of and inputs from learners. In this paper we first study the role of the Technology Acceptance Model (TAM), one of the most influential theories applied in TEL, in how learners come to accept and use a new technology. Then we present the design methodology of the technoself approach for pervasive learning and introduce technoself enhanced learning as a novel pedagogical model to improve student engagement by shaping personal learning focus and setting. Furthermore, we describe the design and development of an AR-based interactive digital interpretation system for augmented learning and discuss its key features. By incorporating mobiles, game simulation, voice recognition, and multimodal interaction through Augmented Reality, learning content can be geared toward learners’ needs, and learners can be stimulated to discover and gain greater understanding.
The system demonstrates that Augmented Reality can provide a rich contextual learning environment and content tailored for individuals. Augmented learning via AR can bridge the gap between theoretical and practical learning, focusing on how the real and the virtual can be combined to fulfill different learning objectives, requirements, and even environments. Finally, we validate and evaluate the AR-based technoself enhanced learning approach to enhancing student motivation and engagement in the learning process through experimental learning practices. The evaluation shows that Augmented Reality is well aligned with constructive learning strategies, as learners can control their own learning and manipulate objects that are not real in the augmented environment to derive and acquire understanding and knowledge across a broad diversity of learning practices, including constructive activities and analytical activities.
Resumo:
Social media is changing the way we interact, present ideas and information, and judge the quality of content and contributions. In recent years hundreds of platforms have emerged to freely share all kinds of information and connect across networks. These new tools generate activity statistics and records of interactions among users, such as mentions, retweets, conversations, and comments on blogs or Facebook; reference managers that show popularity rankings of the references most shared by other researchers; or repositories that generate statistics of visits to or downloads of articles. This paper analyzes the meaning and implications of altmetrics, their advantages and criticisms, and the main platforms (Almetric.com, ImpactStory, Plos altmetrics, PlumX), and reports their progress and benefits for authors, publishers and librarians. It concludes that the value of alternative metrics as a complementary tool to citation analysis is evident, although it is suggested that this issue should be explored in greater depth to unravel the meaning and potential value of these indicators.
Resumo:
Universities have had to adapt to the new communication models that emerged in the Internet era. Within these new paradigms, social networks have burst onto the scene, and Twitter has established itself as one of the most important. The objective of this research is to demonstrate that a relationship exists between a university’s online presence, defined as the amount of information about it available on the Internet, and its Twitter account. To this end, the relationship between online presence and the official profiles of the five universities of the Basque Country and Navarre was analyzed. The results demonstrated a significant correlation between the institutions’ online presence and the number of followers of their respective accounts. Secondly, this research asked whether Twitter can serve to strengthen a university’s online presence. A second hypothesis was therefore formulated to analyze whether having several Twitter accounts would increase the universities’ online presence. The findings for this second hypothesis demonstrated a highly significant correlation between having several Twitter profiles and the universities’ online presence. This demonstrates the importance of online presence for Twitter accounts, and the relevance of Twitter in strengthening the online presence of these institutions.
Resumo:
Populist radical right parties have become major political actors in Europe. This paper analyses the path and the different phases that have led them from the fringes of public debate to their present significance, which rests on their capacity to attract electoral support and influence the political agendas in their respective countries. In addition, an analysis is provided of the core ideological beliefs of these parties, of the topics on which their mobilization capacity rests, and of the type of voters they attract. Finally, the authors discuss the meaning and impact of the growing popularity of the ideas and proposals put forward by populist radical right parties.
Resumo:
This paper deals with the conceptions of the different school actors about the meaning and the implications of mediation in their schools, drawing on data from a qualitative approach carried out as part of a wider project to map mediation perspectives and practices in Catalonia. The authors analyze the scope of the situations regarded as suitable or unsuitable for the introduction of restorative practices, as well as the resistance to change in the practice of conflict resolutions and in the democratization of school culture.