852 results for specific language impairment


Relevance:

90.00%

Abstract:

Recently, blood oxygen level-dependent (BOLD) functional magnetic resonance imaging (fMRI) has become a routine clinical procedure for localizing language and motor brain regions and has been replacing more invasive preoperative procedures. However, the fMRI results from these tasks are not always reproducible, even within the same patient. Evaluating the reproducibility of language and speech mapping is especially complicated because of the complex brain circuitry that may become activated during the functional task. Non-language areas, such as sensory, attention, decision-making, and motor brain regions, may be activated alongside the specific language regions during a traditional sentence-completion task. In this study, I test a new approach, which uses 4-minute video-based tasks, to map language and speech brain regions in patients undergoing brain surgery. Results from 35 subjects show that the video-based task activates Wernicke's area, as well as Broca's area in most subjects. The computed laterality indices, which identify the dominant hemisphere for a given functional task, indicated left-hemisphere dominance for the video-based tasks. This study shows that the video-based task may be an alternative method for localizing language and speech brain regions in patients who are unable to complete the sentence-completion task.
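The abstract does not give the formula behind its laterality indices. A common convention, assumed here, is LI = (L - R) / (L + R) over counts of suprathreshold voxels in each hemisphere, with a cutoff (e.g., ±0.2) separating lateralized from bilateral activation; a minimal sketch under that assumption:

```python
# Conventional laterality-index sketch; the formula and the +/-0.2 cutoff
# are common conventions assumed here, not taken from the study itself.
import numpy as np

def laterality_index(left_stats: np.ndarray, right_stats: np.ndarray,
                     threshold: float) -> float:
    """LI = (L - R) / (L + R) over counts of suprathreshold voxels."""
    l = int(np.count_nonzero(left_stats > threshold))   # active left voxels
    r = int(np.count_nonzero(right_stats > threshold))  # active right voxels
    return (l - r) / (l + r) if (l + r) > 0 else 0.0

li = laterality_index(np.array([0.9, 0.8, 0.3]), np.array([0.75, 0.2, 0.1]), 0.7)
dominance = "left" if li > 0.2 else "right" if li < -0.2 else "bilateral"
print(f"LI = {li:.2f} ({dominance})")
```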

Relevance:

90.00%

Abstract:

Users seeking information may not find documents relevant to their information need in a given language; the information may exist in a language they do not know, making it difficult to access. Since the retrieval process depends on translating the user query, obtaining the right translation raises many issues. For a pair of languages chosen by a user, the available resources, such as an incomplete dictionary or an inaccurate machine translation system, may be insufficient to map the query terms in one language to their equivalents in another. Moreover, a given query may have multiple correct translations, and the underlying corpus evidence may suggest which subset of translations will ultimately yield better retrieval. In this paper, we present a cross-language information retrieval approach that effectively retrieves information in a language other than that of the user query, using a corpus-driven query suggestion approach. The idea is to use corpus-based evidence in one language to improve the retrieval and re-ranking of news documents in the other language. We use the FIRE corpora (Tamil and English news collections) in our experiments and illustrate the effectiveness of the proposed cross-language information retrieval approach.
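A hedged sketch of corpus-driven translation selection: among multiple candidate translations for each query term, prefer the combination whose terms co-occur most often in the target-language corpus. The scoring function and all names are illustrative, not the paper's actual method:

```python
from itertools import product
from typing import Dict, List, Set, Tuple

DocIndex = Dict[str, Set[int]]  # term -> ids of target-language docs containing it

def cooccur(index: DocIndex, t1: str, t2: str) -> int:
    """Number of documents containing both terms."""
    return len(index.get(t1, set()) & index.get(t2, set()))

def best_translation(candidates: List[List[str]], index: DocIndex) -> Tuple[str, ...]:
    """Pick the combination of candidate translations whose terms
    co-occur most often in the target-language corpus."""
    def score(combo: Tuple[str, ...]) -> int:
        return sum(cooccur(index, a, b)
                   for i, a in enumerate(combo) for b in combo[i + 1:])
    return max(product(*candidates), key=score)

index = {"river": {1, 2, 3}, "bank": {2, 3, 4}, "finance": {4}, "shore": {5}}
print(best_translation([["river", "finance"], ["bank", "shore"]], index))
# -> ('river', 'bank'): the corpus evidence favors the riverbank reading
```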

Relevance:

90.00%

Abstract:

Current trends in speech-language pathology focus on early intervention as the preferred tool for promoting the best possible outcomes in children with language disorders. Neuroimaging techniques are being studied as promising tools for flagging at-risk infants. In this study, the auditory brainstem response (ABR) to the syllables /ba/ and /ga/ was examined in 41 infants between 3 and 12 months of age as a possible predictor of language development in toddlerhood. The MacArthur-Bates Communicative Development Inventory (MCDI) was used to assess language development at 18 months of age. The study compared the periodicity of the responses to the stop consonants and the phase differences between /ba/ and /ga/ in both at-risk and low-risk groups, and examined whether the ABR measures (periodicity and phase differentiation) correlate with language development. These measures were found to predict language development at 18 months.
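A minimal sketch (not the study's actual pipeline) of the two ABR measures named above: response periodicity, estimated via autocorrelation, and the phase difference between the /ba/ and /ga/ responses at a given frequency, estimated from the cross-spectrum. The sampling rate and frequency values are illustrative assumptions:

```python
import numpy as np

FS = 8000  # assumed sampling rate (Hz); illustrative only

def periodicity(x: np.ndarray) -> float:
    """Peak of the normalized autocorrelation at positive lags:
    values near 1 indicate a strongly periodic response."""
    x = x - x.mean()
    ac = np.correlate(x, x, mode="full")[len(x) - 1:]  # lags 0..N-1
    ac /= ac[0]
    return float(ac[1:].max())

def phase_difference(ba: np.ndarray, ga: np.ndarray, freq: float) -> float:
    """Phase angle (radians) between the two responses at `freq`,
    taken from the cross-spectrum at the nearest FFT bin."""
    freqs = np.fft.rfftfreq(len(ba), d=1 / FS)
    k = int(np.argmin(np.abs(freqs - freq)))
    return float(np.angle(np.fft.rfft(ba)[k] * np.conj(np.fft.rfft(ga)[k])))
```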

Relevance:

90.00%

Abstract:

Secure Multi-party Computation (MPC) enables a set of parties to collaboratively compute, using cryptographic protocols, a function over their private data such that the participants see only the final output, not each other's data. Typical MPC examples include statistical computations over joint private data, private set intersection, and auctions. While these applications are examples of monolithic MPC, richer MPC applications move between "normal" (i.e., per-party local) and "secure" (i.e., joint, multi-party secure) modes repeatedly, resulting overall in mixed-mode computations. For example, we might use MPC to implement the role of the dealer in a game of mental poker: the game is divided into rounds of local decision-making (e.g., bidding) and joint interaction (e.g., dealing). Mixed-mode computations are also used to improve performance over monolithic secure computations. Starting with the Fairplay project, several MPC frameworks have been proposed in the last decade to help programmers write MPC applications in a high-level language, while the toolchain manages the low-level details. However, these frameworks are either not expressive enough to allow writing mixed-mode applications or lack formal specification and reasoning capabilities, thereby diminishing the parties' trust in such tools and in the programs written using them. Furthermore, none of the frameworks provides a verified toolchain to run MPC programs, leaving open the potential for security holes that can compromise the privacy of parties' data. This dissertation presents language-based techniques to make MPC more practical and trustworthy. First, it presents the design and implementation of a new MPC domain-specific language, called Wysteria, for writing rich mixed-mode MPC applications. Wysteria provides several benefits over previous languages, including a conceptual single thread of control, generic support for more than two parties, high-level abstractions for secret shares, and a fully formalized type system and operational semantics. Using Wysteria, we have implemented several MPC applications, including, for the first time, a card-dealing application. The dissertation next presents Wys*, an embedding of Wysteria in F*, a full-featured verification-oriented programming language. Wys* improves on Wysteria along three lines: (a) it enables programmers to formally verify the correctness and security properties of their programs (as far as we know, Wys* is the first language to provide verification capabilities for MPC programs); (b) it provides a partially verified toolchain to run MPC programs; and (c) it enables MPC programs to use, with no extra effort, standard language constructs from the host language F*, making it more usable and scalable. Finally, the dissertation develops static analyses that help optimize monolithic MPC programs into mixed-mode MPC programs while providing privacy guarantees similar to those of the monolithic versions.
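The mixed-mode structure described above can be illustrated in plain Python rather than Wysteria/Wys*. This is a simulation of the programming model only, with invented names: the "secure" block is stood in for by a function that sees all inputs, whereas a real MPC run would replace it with a cryptographic protocol revealing only the output:

```python
# Sketch of a mixed-mode computation: per-party local code alternates with
# a joint "secure" block. The `secure` function is a stand-in for an MPC
# protocol -- parties learn only its return value.
from typing import Callable, Dict, List

def secure(joint: Callable[[Dict[str, int]], int],
           inputs: Dict[str, int]) -> int:
    """Stand-in for an MPC protocol over the parties' private inputs."""
    return joint(inputs)

def local_bid(party: str) -> int:
    """Placeholder for private, per-party local decision-making."""
    return sum(ord(c) for c in party) % 100

def auction_round(parties: List[str]) -> str:
    # "Normal" mode: each party computes its bid locally and privately.
    private_bids = {p: local_bid(p) for p in parties}
    # "Secure" mode: jointly compute the winner without revealing the bids.
    winner_index = secure(
        lambda bids: max(range(len(parties)), key=lambda i: bids[parties[i]]),
        private_bids)
    return parties[winner_index]

print(auction_round(["alice", "bob", "charlie"]))
```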

Relevance:

80.00%

Abstract:

In 2010 a couple in Cairns were charged with, and later found not guilty of, illegally obtaining a medical abortion using medication imported from overseas. The court case reignited the contentious debate surrounding the illegality and social acceptance of abortion in Queensland, Australia. Based on a critical discourse analysis of 150 online news media articles covering the Cairns trial, this paper argues that the media shapes perceptions of deviance and stigma in relation to abortion through its use of language. In this case, the Cairns couple were positioned as deviant for pursuing abortion on the basis that they were rejecting the social norm of motherhood. The paper identifies three key themes in the articles analysed that contribute to the construction of deviance: the humanising of the foetus, the stereotyping of the traditional female role of mother, and the demonising of women who choose abortion. It argues that the use of specific language in media coverage of abortion has the power to disrespect and invalidate the experiences, rights, and health of women who choose to terminate pregnancies.

Relevance:

80.00%

Abstract:

Social system-level analyses of journalism have tended to focus on political and economic influences, at the expense of other factors, such as the role that culture and cultural values play in shaping journalists' professional views and practices. This paper identifies cultural values as a particularly fruitful area for providing a more nuanced analysis of journalism culture. It examines this issue in the context of in-depth interviews with 20 Māori journalists from Aotearoa New Zealand. The study finds that Indigenous journalism in that country is strongly influenced by Māori cultural values, such as showing respect to others, following cultural protocols, and making use of culturally specific language. Cultural limitations are also identified in the form of the social structures of Māori society, and journalists' strategies for working around these are discussed. The paper highlights the implications a renewed focus on cultural values can have for the study of journalism culture more broadly.

Relevance:

80.00%

Abstract:

This paper presents the design and implementation of PolyMage, a domain-specific language and compiler for image processing pipelines. An image processing pipeline can be viewed as a graph of interconnected stages which process images successively. Each stage typically performs a point-wise, stencil, reduction, or data-dependent operation on image pixels. Individual stages in a pipeline typically exhibit abundant data parallelism that can be exploited with relative ease. However, the stages also require high memory bandwidth, preventing effective utilization of the parallelism available on modern architectures. For applications that demand high performance, the traditional options are to use optimized libraries like OpenCV or to optimize manually. Using libraries precludes optimization across library routines, while manual optimization accounting for both parallelism and locality is very tedious. The focus of our system, PolyMage, is on automatically generating high-performance implementations of image processing pipelines expressed in a high-level declarative language. Our optimization approach relies primarily on the transformation and code generation capabilities of the polyhedral compiler framework. To the best of our knowledge, this is the first model-driven compiler for image processing pipelines that performs complex fusion, tiling, and storage optimization automatically. Experimental results on a modern multicore system show that the performance achieved by our automatic approach is up to 1.81x better than that achieved through manual tuning in Halide, a state-of-the-art language and compiler for image processing pipelines. For a camera raw image processing pipeline, our performance is comparable to that of a hand-tuned implementation.
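To make the stage-graph model concrete, here is a small numpy sketch (not PolyMage syntax) of a two-stage pipeline: one point-wise stage followed by one stencil stage. PolyMage would fuse and tile such stages automatically; here they run naively, one full image at a time:

```python
import numpy as np

def brighten(img: np.ndarray, gain: float = 1.2) -> np.ndarray:
    """Point-wise stage: each output pixel depends on one input pixel."""
    return np.clip(img * gain, 0.0, 1.0)

def blur3x3(img: np.ndarray) -> np.ndarray:
    """Stencil stage: each output pixel depends on a 3x3 neighborhood."""
    out = np.zeros_like(img)
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            out += np.roll(np.roll(img, dy, axis=0), dx, axis=1)
    return out / 9.0

pipeline = [brighten, blur3x3]   # a two-stage pipeline graph (here, a chain)
image = np.random.rand(256, 256)
for stage in pipeline:
    image = stage(image)
```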

Relevance:

80.00%

Abstract:

Traditional software development captures user needs during requirements analysis. The Web makes this endeavour even harder because of the difficulty of determining who these users are. In an attempt to tackle the heterogeneity of the user base, Web Personalization techniques have been proposed to guide the users' experience. In addition, Open Innovation allows organisations to look beyond their internal resources to develop new products or improve existing processes. This thesis sits between the two by introducing Open Personalization as a means of incorporating actors other than webmasters in the personalization of web applications. The aim is to provide the technological basis for a trustworthy environment in which webmasters and companion actors can collaborate, i.e. "an architecture of participation". Such an architecture depends heavily on these actors' profiles. This work tackles three profiles (software partners, hobby programmers, and end users) and proposes an "architecture of participation" tuned for each. Each architecture rests on a different technology: a .NET annotation library based on Inversion of Control for software partners, a Modding Interface in JavaScript for hobby programmers, and a domain specific language for end users. Proof-of-concept implementations are available for all three cases, and a quantitative evaluation is conducted for the domain specific language.

Relevance:

80.00%

Abstract:

Cyber-physical systems integrate computation, networking, and physical processes. Substantial research challenges exist in the design and verification of such large-scale, distributed sensing, actuation, and control systems. Rapidly improving technology and recent advances in control theory, networked systems, and computer science give us the opportunity to drastically improve our approach to the integrated flow of information and cooperative behavior. Current systems rely on text-based specifications and manual design. Using new technology advances, we can create easier, more efficient, and cheaper ways of developing these control systems. This thesis focuses on design considerations for system topologies, ways to formally and automatically specify requirements, and methods to synthesize reactive control protocols, all within the context of an aircraft electric power system as a representative application area.

This thesis consists of three complementary parts: synthesis, specification, and design. The first part focuses on the synthesis of centralized and distributed reactive controllers for an aircraft electric power system. This approach incorporates methodologies from computer science and control. The resulting controllers are correct by construction with respect to system requirements, which are formulated in the specification language of linear temporal logic (LTL). The second part addresses how to formally specify requirements and introduces a domain-specific language for electric power systems. A software tool automatically converts high-level requirements into LTL and synthesizes a controller.
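As a hedged illustration of the kind of requirement such a DSL might compile to, here is a hypothetical LTL property; the signal names are invented for this sketch and are not taken from the thesis:

```latex
% Hypothetical power-system requirement in LTL; signal names are illustrative.
% "Always, if the generator fails, then eventually the backup contactor
%  closes, and from that point on the failed generator's own contactor
%  stays open."
\square\,\big(\mathit{gen\_fail} \rightarrow \lozenge\,\mathit{backup\_closed}\big)
\;\wedge\;
\square\,\big(\mathit{gen\_fail} \rightarrow \square\,\neg\,\mathit{gen\_contactor\_closed}\big)
```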

The final part focuses on design space exploration. A design methodology is proposed that uses mixed-integer linear programming to obtain candidate topologies, which are then used to synthesize controllers. The discrete-time control logic is then verified in real time by two methods: in hardware and in simulation. Finally, the problem of partial observability and dynamic state estimation is explored. Given a fixed placement of sensors on an electric power system, measurements from these sensors can be used in conjunction with the control logic to infer the state of the system.

Relevance:

80.00%

Abstract:

Alligator mississippiensis (American Alligators) demonstrated low hatch-rate success and increased adult mortality on Lake Griffin, FL, between 1998 and 2003. Dying Lake Griffin alligators with symptoms of poor motor coordination were reported to show specific neurological impairment and brain lesions. Similar lesions have been documented in salmonines that consumed clupeids with high thiaminase levels. We therefore investigated the diet of Lake Griffin alligators and compared it with alligator diets from two lakes that exhibited relatively low levels of unexplained alligator mortality, to see whether consumption of Dorosoma cepedianum (gizzard shad) could be correlated with patterns of mortality. Shad in both Lakes Griffin and Apopka had high levels of thiaminase, and Lake Apopka alligators were consuming greater amounts of shad than Lake Griffin alligators without showing similar mortality rates. A relationship between shad consumption alone and alligator mortality is therefore not supported.

Relevance:

80.00%

Abstract:

Motivated by the design and development challenges of the BART case study, an approach for developing and analyzing a formal model of reactive systems is presented. The approach makes use of a domain specific language for specifying control algorithms able to satisfy competing properties such as safety and optimality. The domain language, called SPC, offers several key abstractions, such as the state, the profile, and the constraint, to facilitate problem specification. Using a high-level program transformation system such as HATS, being developed at the University of Nebraska at Omaha, specifications in this modelling language can be transformed into ML code. The resulting executable specification can be further refined by applying generic transformations to the abstractions provided by the domain language. Problem-dependent transformations utilizing domain-specific knowledge and properties may also be applied. The result is a significantly more efficient implementation which can be used for simulation and for gaining deeper insight into design decisions and various control policies. The correctness of transformations can be established using Rewrite Rule Laboratory, a rewrite-rule based induction theorem prover developed at the University of New Mexico.
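The abstract names SPC's three abstractions without showing syntax. A speculative sketch of how they might be modeled, with BART-style train state; all names here are illustrative, not SPC's actual constructs:

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class State:
    position: float  # train position along the track (m)
    velocity: float  # current speed (m/s)

@dataclass
class Profile:
    """A target trajectory, e.g. a speed limit as a function of position."""
    limit: Callable[[float], float]

@dataclass
class Constraint:
    """A property every reachable state must satisfy, e.g. safety."""
    holds: Callable[[State], bool]

# Illustrative instances: a 36 m/s safety bound, tightened past 1000 m.
profile = Profile(limit=lambda pos: 36.0 if pos < 1000.0 else 20.0)
safety = Constraint(holds=lambda s: s.velocity <= profile.limit(s.position))
assert safety.holds(State(position=500.0, velocity=30.0))
```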

Relevance:

80.00%

Abstract:

snBench is a platform on which novice users compose and deploy distributed Sense-and-Respond programs for simultaneous execution on a shared, distributed infrastructure. It is a natural imperative that we be able to (1) verify the safety/correctness of newly submitted tasks and (2) derive the resource requirements of these tasks so that correct allocation may occur. To achieve these goals we have established a multi-dimensional sized type system for our functional-style Domain Specific Language (DSL) called Sensor Task Execution Plan (STEP). In such a type system, data types are annotated with a vector of size attributes (e.g., upper and lower size bounds). Tracking multiple size aspects proves essential in a system in which Images are manipulated as a first-class data type, as image manipulation functions may have specific minimum and/or maximum resolution restrictions on the input they can correctly process. Through static analysis of STEP instances we not only verify basic type safety and establish upper computational resource bounds (i.e., time and space), but also derive and solve data and resource sizing constraints (e.g., Image resolution, camera capabilities) from the implicit constraints embedded in program instances. In fact, the static methods presented here have benefits beyond their application to Image data and may be extended to other data types that require tracking multiple dimensions (e.g., image "quality", video frame rate or aspect ratio, audio sampling rate). In this paper we present the syntax and semantics of our functional language, our type system that builds cost and resource/data constraints, and, through both formalism and specific details of our implementation, concrete examples of how the constraints and sizing information are used in practice.
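A minimal sketch of the sized-type idea, assuming illustrative names: an image type annotated with lower and upper resolution bounds, and a composition check that every image a producer can emit is acceptable to the consumer. STEP's real type system is richer than this:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class SizedImageType:
    min_w: int  # lower/upper bounds on width and height, in pixels
    max_w: int
    min_h: int
    max_h: int

def compose_ok(producer_out: SizedImageType, consumer_in: SizedImageType) -> bool:
    """Static check: the producer's output range must lie entirely
    inside the consumer's accepted input range."""
    return (consumer_in.min_w <= producer_out.min_w
            and producer_out.max_w <= consumer_in.max_w
            and consumer_in.min_h <= producer_out.min_h
            and producer_out.max_h <= consumer_in.max_h)

camera = SizedImageType(640, 1920, 480, 1080)       # what a camera can emit
face_detect = SizedImageType(320, 4096, 240, 4096)  # what a function accepts
assert compose_ok(camera, face_detect)
```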

Relevance:

80.00%

Abstract:

NetSketch is a tool that enables the specification of network-flow applications and the certification of desirable safety properties imposed thereon. NetSketch is conceived to assist system integrators in two types of activities: modeling and design. As a modeling tool, it enables the abstraction of an existing system so as to retain sufficient detail to enable future analysis of safety properties. As a design tool, NetSketch enables the exploration of alternative safe designs as well as the identification of minimal requirements for outsourced subsystems. NetSketch embodies a lightweight formal verification philosophy, whereby the power (but not the heavy machinery) of a rigorous formalism is made accessible to users via a friendly interface. NetSketch does so by exposing tradeoffs between exactness of analysis and scalability, and by combining traditional whole-system analysis with a more flexible compositional analysis approach based on a strongly-typed Domain-Specific Language (DSL) for specifying network configurations at various levels of sketchiness, along with invariants that need to be enforced thereupon. In this paper, we overview NetSketch, highlight its salient features, and illustrate how it could be used in applications, including the management/shaping of traffic flows in a vehicular network (as a proxy for CPS applications) and in a streaming media network (as a proxy for Internet applications). In a companion paper, we define the formal system underlying the operation of NetSketch, in particular the DSL behind NetSketch's user-interface when used in "sketch mode", and prove its soundness relative to appropriately-defined notions of validity.
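To convey the intuition behind the compositional analysis (this is not NetSketch's actual DSL or typing rules, which are defined in the companion paper), here is a sketch in which each component carries admissible flow bounds, serial composition intersects them, and a safety check tests whether a demand interval fits:

```python
from typing import Tuple

Interval = Tuple[float, float]  # (lo, hi) admissible flow bounds

def in_series(a: Interval, b: Interval) -> Interval:
    """Flow traversing both components must satisfy both bounds;
    an empty result (lo > hi) means the composition is infeasible."""
    return (max(a[0], b[0]), min(a[1], b[1]))

def safe(component: Interval, demand: Interval) -> bool:
    """The entire demand interval must fall within the component's bounds."""
    lo, hi = component
    return lo <= demand[0] and demand[1] <= hi

link = in_series((2.0, 10.0), (0.0, 8.0))   # composed bounds: (2.0, 8.0)
assert safe(link, (3.0, 7.0))
assert not safe(link, (1.0, 7.0))           # demand can drop below the minimum
```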

Relevance:

80.00%

Abstract:

NetSketch is a tool for the specification of constrained-flow applications and the certification of desirable safety properties imposed thereon. NetSketch is conceived to assist system integrators in two types of activities: modeling and design. As a modeling tool, it enables the abstraction of an existing system while retaining sufficient information about it to carry out future analysis of safety properties. As a design tool, NetSketch enables the exploration of alternative safe designs as well as the identification of minimal requirements for outsourced subsystems. NetSketch embodies a lightweight formal verification philosophy, whereby the power (but not the heavy machinery) of a rigorous formalism is made accessible to users via a friendly interface. NetSketch does so by exposing tradeoffs between exactness of analysis and scalability, and by combining traditional whole-system analysis with a more flexible compositional analysis. The compositional analysis is based on a strongly-typed Domain-Specific Language (DSL) for describing and reasoning about constrained-flow networks at various levels of sketchiness along with invariants that need to be enforced thereupon. In this paper, we define the formal system underlying the operation of NetSketch, in particular the DSL behind NetSketch's user-interface when used in "sketch mode", and prove its soundness relative to appropriately-defined notions of validity. In a companion paper [6], we overview NetSketch, highlight its salient features, and illustrate how it could be used in two applications: the management/shaping of traffic flows in a vehicular network (as a proxy for CPS applications) and in a streaming media network (as a proxy for Internet applications).

Relevance:

80.00%

Abstract:

In research areas involving mathematical rigor, there are numerous benefits to adopting a formal representation of models and arguments: reusability, automatic evaluation of examples, and verification of consistency and correctness. However, accessibility has not been a priority in the design of the formal verification tools that can provide these benefits. In earlier work [30] we attempted to address this broad problem by proposing several specific design criteria organized around the notion of a natural context: the sphere of awareness a working human user maintains of the relevant constructs, arguments, experiences, and background materials necessary to accomplish the task at hand. In this report we evaluate our proposed design criteria by using, in the context of novel research, a formal reasoning system that is designed according to these criteria. In particular, we consider how the design and capabilities of the formal reasoning system we employ influence, aid, or hinder our ability to accomplish a formal reasoning task: the assembly of a machine-verifiable proof pertaining to the NetSketch formalism. NetSketch is a tool for the specification of constrained-flow applications and the certification of desirable safety properties imposed thereon. NetSketch is conceived to assist system integrators in two types of activities: modeling and design. It provides capabilities for compositional analysis based on a strongly-typed domain-specific language (DSL) for describing and reasoning about constrained-flow networks and the invariants that need to be enforced thereupon. In a companion paper [13] we overview NetSketch, highlight its salient features, and illustrate how it could be used in actual applications. In this paper, we define, using a machine-readable syntax, major parts of the formal system underlying the operation of NetSketch, along with its semantics and a corresponding notion of validity. We then provide a proof of soundness for the formalism that can be partially verified using a lightweight formal reasoning system that simulates natural contexts. A traditional presentation of these definitions and arguments can be found in the full report on the NetSketch formalism [12].