878 results for NELIAC (Computer program language)


Relevance:

100.00%

Publisher:

Abstract:

The objective of this research is to synthesize structural composites designed with particular areas defined by custom modulus, strength and toughness values, in order to improve the overall mechanical behavior of the composite. Such composites are defined and referred to as 3D-designer composites. These composites will be formed from liquid crystalline polymers and carbon nanotubes. The fabrication process is a variation of the rapid prototyping process, which is a layered, additive-manufacturing approach. Composites formed using this process can be custom designed by appropriate modeling methods for superior performance in advanced applications. The focus of this research is on enhancement of Young's modulus in order to make the final composite stiffer. Strength and toughness of the final composite with respect to various applications are also discussed. We have taken into consideration the mechanical properties of the final composite at different fiber volume contents as well as at different orientations and lengths of the fibers. The orientation of the LC monomers is intended to be achieved using electric or magnetic fields. A computer program incorporating the Mori-Tanaka modeling scheme is used to generate the stiffness matrix of the final composite. The final properties are then deduced from the stiffness matrix using composite micromechanics. Eshelby's tensor, required to calculate the stiffness tensor using the Mori-Tanaka method, is calculated using a numerical scheme that determines the components of the Eshelby tensor (Gavazzi and Lagoudas 1990). The numerical integration is solved using a Gaussian quadrature scheme and is also worked out using MATLAB. MATLAB provides a rich set of commands and algorithms that can be used to evaluate the formulation efficiently. Graphs are plotted using different combinations of the results and the parameters involved in obtaining them.
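
As an illustration of the Mori-Tanaka step described above, the following minimal Python sketch (not the authors' MATLAB code) assembles an effective stiffness matrix from matrix and fiber stiffness tensors and a fiber volume fraction; it assumes the Eshelby tensor has already been computed, e.g. with the Gavazzi-Lagoudas numerical scheme, and the 6x6 Voigt-notation inputs are hypothetical.

import numpy as np

def mori_tanaka_stiffness(C_m, C_f, S, vf):
    # Effective 6x6 stiffness (Voigt notation) of a two-phase composite
    # via the Mori-Tanaka scheme.
    #   C_m, C_f : matrix and fiber stiffness matrices (6x6)
    #   S        : Eshelby tensor for the inclusion shape (6x6)
    #   vf       : fiber volume fraction
    I = np.eye(6)
    dC = C_f - C_m
    # dilute strain-concentration tensor: A_dil = [I + S C_m^-1 (C_f - C_m)]^-1
    A_dil = np.linalg.inv(I + S @ np.linalg.inv(C_m) @ dC)
    # Mori-Tanaka concentration tensor
    A_mt = A_dil @ np.linalg.inv((1.0 - vf) * I + vf * A_dil)
    # effective stiffness of the composite
    return C_m + vf * dC @ A_mt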

Relevance:

100.00%

Publisher:

Abstract:

Introduction: Understanding the experience of abuse, the contextual determinants that led to breaking away from the situation, and the attempts to build a more harmonious future is essential to raise awareness and to better understand victims of domestic violence. Objectives: To understand the suffering of women victims of violence. Methods: This is an intentional sample of 21 women who were in a shelter home or in the community. The data were collected by interviews guided by a script organized into four themes. The interviews were audio recorded with the participants' permission, fully transcribed, and analyzed as two different corpora, depending on the context in which they occurred. The analysis was conducted using the ALCESTE computer program. The study obtained a favorable opinion of the Committee on Health and Welfare of the University of Évora. Results: From the analysis of the first sample, five classes emerged. The association of the words gave the meaning of each class, which we have named as Class 1 - Precipitating Events; Class 2 - Experience of abuse; Class 3 - Two feet in the present and looking into the future; Class 4 - The present and learning from the experience of abuse; and Class 5 - Violence in general. From the analysis of the sample in the community, four classes emerged, which we have named as Class 1 - Violence in general; Class 2 - Precipitating Events; Class 3 - Experience of abuse; and Class 4 - Support in the process. Conclusions: Women who are in a shelter home are very focused on their experience of violence and its entire context; for them the future is distant and unclear. Women in the community have a more comprehensive view of the phenomenon of violence as a whole; they can decentre from their personal experiences and recognize the importance of support in the process of building a future.

Relevance:

50.00%

Publisher:

Abstract:

The CIL compiler for core Standard ML compiles whole programs using a novel typed intermediate language (TIL) with intersection and union types and flow labels on both terms and types. The CIL term representation duplicates portions of the program where intersection types are introduced and union types are eliminated. This duplication makes it easier to represent type information and to introduce customized data representations. However, duplication incurs compile-time space costs that are potentially much greater than are incurred in TILs employing type-level abstraction or quantification. In this paper, we present empirical data on the compile-time space costs of using CIL as an intermediate language. The data shows that these costs can be made tractable by using sufficiently fine-grained flow analyses together with standard hash-consing techniques. The data also suggests that non-duplicating formulations of intersection (and union) types would not achieve significantly better space complexity.
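
The space savings mentioned above rest on standard hash-consing: structurally equal terms are shared so that duplicated program fragments cost a single representation in memory. The Python sketch below is a generic illustration of that technique, not the CIL compiler's Standard ML implementation; the type constructors shown are hypothetical.

# shared table of canonical type terms
_table = {}

def hashcons(node):
    # Return the canonical shared instance of a structurally equal node.
    # Children are assumed to be hash-consed already.
    return _table.setdefault(node, node)

def arrow(t1, t2):
    return hashcons(('arrow', t1, t2))

def inter(t1, t2):
    return hashcons(('inter', t1, t2))

INT = hashcons(('int',))

# two separately built terms end up as one shared object
a = arrow(inter(INT, INT), INT)
b = arrow(inter(INT, INT), INT)
assert a is b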

Relevance:

50.00%

Publisher:

Abstract:

The effects of two types of small-group communication, synchronous computer-mediated and face-to-face, on the quantity and quality of verbal output were compared. Quantity was defined as the number of turns taken per minute, the number of Analysis-of-Speech units (AS-units) produced per minute, and the number of words produced per minute. Quality was defined as the number of words produced per AS-unit. In addition, the interaction of gender and type of communication was explored for any differences that existed in the output produced. Questionnaires were also given to participants to determine attitudes toward computer-mediated and face-to-face communication. Thirty intermediate-level students from the Intensive English Language Program (IELP) at Brock University participated in the study, including 15 females and 15 males. Nonparametric tests, including the Wilcoxon matched-pairs test, Mann-Whitney U test, and Friedman test, were used to test for significance at the p < .05 level. No significant differences were found in the effects of computer-mediated and face-to-face communication on the output produced during follow-up speaking sessions. However, the quantity and quality of interaction were significantly higher during face-to-face sessions than during computer-mediated sessions. No significant differences were found in the output produced by males and females in these two conditions. While participants felt that the use of computer-mediated communication may aid in the development of certain language skills, they generally preferred face-to-face communication. These results differed from previous studies that found a greater quantity and quality of output, in addition to greater equality of interaction, during computer-mediated sessions in comparison to face-to-face sessions (Kern, 1995; Warschauer, 1996).
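
For readers unfamiliar with the nonparametric tests named above, the short Python sketch below shows how such comparisons are typically run; the per-participant output rates are hypothetical and are not the study's data.

import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
words_per_min_cmc = rng.normal(8, 2, 30)    # computer-mediated condition
words_per_min_f2f = rng.normal(11, 2, 30)   # face-to-face condition

# Wilcoxon matched-pairs test: the same 30 participants in both conditions
print(stats.wilcoxon(words_per_min_cmc, words_per_min_f2f))

# Mann-Whitney U test: e.g., females vs. males within one condition
print(stats.mannwhitneyu(words_per_min_f2f[:15], words_per_min_f2f[15:]))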

Relevance:

50.00%

Publisher:

Abstract:

Tensor3D is a geometric modeling program with the capacity to simulate and visualize in real time the deformation, specified through a tensor matrix, applied to triangulated models representing geological bodies. 3D visualization allows the study of deformational processes, such as simple and pure shear, that are traditionally analyzed in 2D. Besides the geometric objects that are immediately available in the program window, the program can read other models from disk, thus being able to import objects created with different open-source or proprietary programs. A strain ellipsoid and a bounding box are shown simultaneously and are instantly deformed with the main object. The principal axes of strain are visualized as well, to provide graphical information about the orientation of the tensor's normal components. The deformed models can also be saved, retrieved later and deformed again, in order to study different steps of progressive strain, or to make these data available to other programs. The shape of the stress ellipsoids and the corresponding Mohr circles defined by any stress tensor can also be represented. The application was written using the Visualization ToolKit, a powerful scientific visualization library in the public domain. This development choice, allied with the use of the Tcl/Tk programming language, which is independent of the host computational platform, makes the program a useful tool for the study of geometric deformations directly in three dimensions, in teaching as well as in research activities. (C) 2007 Elsevier Ltd. All rights reserved.
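
As a minimal illustration of the kind of operation Tensor3D performs (this is not Tensor3D code), the Python sketch below applies a deformation tensor to the vertices of a triangulated model and recovers the strain-ellipsoid axes from the same tensor; the simple-shear tensor and the random vertices are placeholders.

import numpy as np

F = np.array([[1.0, 0.8, 0.0],     # hypothetical simple-shear deformation tensor
              [0.0, 1.0, 0.0],
              [0.0, 0.0, 1.0]])

vertices = np.random.rand(100, 3)  # placeholder vertices of a triangulated model
deformed = vertices @ F.T          # x' = F x applied to every vertex

# The left Cauchy-Green tensor B = F F^T describes the strain ellipsoid:
# its eigenvectors give the principal strain axes and the square roots of
# its eigenvalues give the lengths of the ellipsoid semi-axes.
B = F @ F.T
lengths_sq, axes = np.linalg.eigh(B)
semi_axes = np.sqrt(lengths_sq)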

Relevance:

50.00%

Publisher:

Abstract:

Field-Programmable Gate Arrays (FPGAs) are becoming increasingly important in embedded and high-performance computing systems. They allow performance levels close to the ones obtained with Application-Specific Integrated Circuits, while still keeping design and implementation flexibility. However, to program FPGAs efficiently, one needs the expertise of hardware developers in order to master hardware description languages (HDLs) such as VHDL or Verilog. Attempts to furnish a high-level compilation flow (e.g., from C programs) still have to address open issues before broadly efficient results can be obtained. Bearing in mind the resources available on an FPGA, we have developed LALP (Language for Aggressive Loop Pipelining), a novel language to program FPGA-based accelerators, together with its compilation framework, including mapping capabilities. The main ideas behind LALP are to provide a higher abstraction level than HDLs, to exploit the intrinsic parallelism of hardware resources, and to allow the programmer to control execution stages whenever the compiler techniques are unable to generate efficient implementations. Those features are particularly useful for implementing loop pipelining, a well-regarded technique used to accelerate computations in several application domains. This paper describes LALP and shows how it can be used to achieve high-performance computing solutions.
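
To make the payoff of loop pipelining concrete, the toy Python calculation below (a generic illustration, not LALP syntax) compares the cycle count of a loop whose iterations run back to back with one whose iterations are overlapped at a given initiation interval.

def cycles_sequential(iterations, stages):
    # each iteration occupies the full loop-body latency before the next starts
    return iterations * stages

def cycles_pipelined(iterations, stages, ii=1):
    # the first iteration fills the pipeline; the rest start every `ii` cycles
    return stages + (iterations - 1) * ii

n, depth = 1000, 4
print(cycles_sequential(n, depth))   # 4000 cycles
print(cycles_pipelined(n, depth))    # 1003 cycles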

Relevance:

50.00%

Publisher:

Abstract:

Originally presented as the author's thesis (M.S.), University of Illinois at Urbana-Champaign.

Relevance:

40.00%

Publisher:

Abstract:

The Wikipedia has become the most popular online source of encyclopedic information. The English Wikipedia collection, as well as some other language collections, is extensively linked. However, as a multilingual collection the Wikipedia is only very weakly linked. There are few cross-language links or cross-dialect links (see, for example, Chinese dialects). In order to link the multilingual Wikipedia as a single collection, automated cross-language link discovery systems are needed – systems that identify anchor-texts in one language and targets in another. The evaluation of link discovery approaches within the English version of the Wikipedia has been examined in the INEX Link-the-Wiki track since 2007, whilst both CLEF and NTCIR have emphasized the investigation and evaluation of cross-language information retrieval. In this position paper we propose a new virtual evaluation track: Cross Language Link Discovery (CLLD). The track will initially examine cross-language linking of Wikipedia articles. This virtual track will not be tied to any one forum; instead we hope it can be connected to each of (at least) CLEF, NTCIR, and INEX, as it will cover ground currently studied by each. The aim is to establish a virtual evaluation environment supporting continuous assessment and evaluation, and a forum for the exchange of research ideas. It will be free from the difficulties of scheduling and synchronizing groups of collaborating researchers and will alleviate the necessity to travel across the globe in order to share knowledge. We aim to electronically publish peer-reviewed publications arising from CLLD in a similar fashion: online, with open access, and without fixed submission deadlines.

Relevance:

40.00%

Publisher:

Abstract:

Background/Aim: There is a 70% higher age-adjusted incidence of heart failure (HF) amongst Aboriginal and Torres Strait Islander people than amongst non-Aboriginal people, with three times more hospitalisations and twice as many deaths. There is a need to develop holistic yet individualised approaches, in accord with the values of Aboriginal community healthcare, to support patient education and self-care. The aim of this study was to re-design an existing HF educational resource (Fluid Watchers-Pacific Rim©) to be culturally safe for Aboriginal and Torres Strait Islander peoples, working in collaboration with the local community, and to conduct feasibility testing. Methods: This study was conducted in two phases and utilised a mixed methods approach (qualitative and quantitative). Phase 1 of this study used action research methods to develop a culturally safe electronic resource to be provided to Aboriginal HF patients via a tablet computer. A HF expert panel adapted the existing resource to ensure it was evidence-based and contained appropriate language and images reflecting Aboriginal culture. A stakeholder group (which included Aboriginal workers and HF patients, as well as researchers and clinicians) then reviewed the resource, and changes were made accordingly. In Phase 2, the new resource was tested on a sample of Aboriginal HF patients to assess feasibility and acceptability. Patient knowledge, satisfaction and self-care behaviours were measured using a before-and-after design with validated questionnaires. As this was a pilot test to determine feasibility, no statistical comparisons were made. Results - Phase 1: Throughout the process of resource development, two main themes emerged from the stakeholder consultation. The first was the importance of identity: it was important to ensure that the resource accurately reflected the local community, with appropriate clothing, skin tone and voice. The resource was adapted to reflect this, and members of the local community voiced the recordings for the resource. The other theme was comprehension: images were important, and all text was converted to the first person and used plain language. Results - Phase 2: Five Aboriginal participants, mean age 61.6 ± 10.0 years, with NYHA Class III and IV heart failure were enrolled. Participants reported a high level of satisfaction with the resource (83.0%). HF knowledge (percentage of correct responses) increased from 48.0 ± 6.7% to 58.0 ± 9.7%, a 20.8% increase, and results of the self-care index indicated that the biggest change was in patient confidence for self-care, with a 95% increase in confidence score (46.7 ± 16.0 to 91.1 ± 11.5). Changes in management and maintenance scores varied between patients. Conclusion: By working in collaboration with HF experts, Aboriginal researchers and patients, a culturally safe HF resource has been developed for Aboriginal and Torres Strait Islander patients. Engaging Aboriginal researchers, capacity-building, and being responsive to local systems and structures enabled this pilot study to be successfully completed with the Aboriginal community, and positive participant feedback demonstrated that the methodology used in this study was appropriate and acceptable; participants were able to engage with willingness and confidence.

Relevance:

40.00%

Publisher:

Abstract:

Many novel computer architectures, such as array processors and multiprocessors, which achieve high performance through the use of concurrency, exploit variations of the von Neumann model of computation. The effective utilization of these machines makes special demands on programmers and their programming languages, such as the structuring of data into vectors or the partitioning of programs into concurrent processes. In comparison, the data flow model of computation demands only that the principle of structured programming be followed. A data flow program, often represented as a data flow graph, is a program that expresses a computation by indicating the data dependencies among operators. A data flow computer is a machine designed to take advantage of concurrency in data flow graphs by executing data-independent operations in parallel. In this paper, we discuss the design of a high-level language (DFL: Data Flow Language) suitable for data flow computers. Some sample procedures in DFL are presented. The implementation aspects are not discussed in detail, since no new problems were encountered. The language DFL embodies the concepts of functional programming, but in appearance it closely resembles Pascal. The language is a better vehicle than the data flow graph for expressing a parallel algorithm. The compiler has been implemented on a DEC 1090 system in Pascal.
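
A minimal sketch of data-driven execution (illustrative only, not DFL itself): each node of a data flow graph fires once its operands are available, so operations with no data dependency between them may run concurrently.

from concurrent.futures import ThreadPoolExecutor

# graph: node -> (operator, operand nodes); 'const' nodes carry values
graph = {
    'a': ('const', 2), 'b': ('const', 3), 'c': ('const', 4),
    'x': ('mul', 'a', 'b'),   # x = a * b  } no dependency between x and y,
    'y': ('add', 'b', 'c'),   # y = b + c  } so they may fire together
    'z': ('add', 'x', 'y'),   # z = x + y  waits for both operands
}
ops = {'add': lambda p, q: p + q, 'mul': lambda p, q: p * q}

def evaluate(node, pool):
    kind, *args = graph[node]
    if kind == 'const':
        return args[0]
    # evaluate independent operands concurrently (data-driven firing)
    futures = [pool.submit(evaluate, a, pool) for a in args]
    return ops[kind](*(f.result() for f in futures))

with ThreadPoolExecutor(max_workers=8) as pool:
    print(evaluate('z', pool))   # prints 13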

Relevance:

40.00%

Publisher:

Abstract:

Background: A cancer diagnosis elicits greater distress than any other medical diagnosis, and yet very few studies have evaluated the efficacy of structured online self-help therapeutic programs to alleviate this distress. This study aims to assess the efficacy over time of an internet Cognitive Behaviour Therapy (iCBT) intervention (‘Finding My Way’) in improving distress, coping and quality of life for individuals with a recent diagnosis of early stage cancer of any type. Methods/Design: The study is a multi-site Randomised Controlled Trial (RCT) seeking to enrol 188 participants who will be randomised to either the Finding My Way intervention or an attention-control condition. Both conditions are delivered online, with 6 modules released once per week and an additional booster module released one month after program completion. Participants complete online questionnaires on 4 occasions: at baseline (immediately prior to accessing the modules); post-treatment (immediately after program completion); then three and six months later. Primary outcomes are general distress and cancer-specific distress, with secondary outcomes including Health-Related Quality of Life (HRQoL), coping, health service utilisation, intervention adherence, and user satisfaction. A range of baseline measures will be assessed as potential moderators of outcomes. Eligible participants are individuals recently diagnosed with any type of cancer, being treated with curative intent, aged over 18 years, with sufficient English language literacy, internet access and an active email account and phone number. Participants are blinded to treatment group allocation. Randomisation is computer generated and stratified by gender. Discussion: Compared to the few prior published studies, Finding My Way will be the first adequately powered trial to offer an iCBT intervention to curatively treated patients of heterogeneous cancer types in the immediate post-diagnosis/treatment period. If found efficacious, Finding My Way will assist with overcoming common barriers to face-to-face therapy in a cost-effective and accessible way, thus helping to reduce distress after cancer diagnosis and consequently decrease the cancer burden for individuals and the health system. Trial registration: Australian New Zealand Clinical Trials Registry ACTRN12613000001796, registered 16.10.13.

Relevance:

40.00%

Publisher:

Abstract:

The method of structured programming or program development using a top-down, stepwise refinement technique provides a systematic approach for the development of programs of considerable complexity. The aim of this paper is to present the philosophy of structured programming through a case study of a nonnumeric programming task. The problem of converting a well-formed formula in first-order logic into prenex normal form is considered. The program has been coded in the programming language PASCAL and implemented on a DEC-10 system. The program has about 500 lines of code and comprises 11 procedures.
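
As a companion to the case study, the Python sketch below carries out the same prenex conversion on a small formula representation (nested tuples); it is a compact illustration, not the paper's roughly 500-line PASCAL program. The usual steps are followed: eliminate implications, push negations inward, rename bound variables apart, then pull quantifiers to the front.

import itertools
_fresh = (f"v{i}" for i in itertools.count())   # fresh variable names

def elim_implies(f):
    op = f[0]
    if op == 'implies':
        return ('or', ('not', elim_implies(f[1])), elim_implies(f[2]))
    if op in ('and', 'or'):
        return (op, elim_implies(f[1]), elim_implies(f[2]))
    if op in ('forall', 'exists'):
        return (op, f[1], elim_implies(f[2]))
    if op == 'not':
        return ('not', elim_implies(f[1]))
    return f                                     # atomic predicate

def nnf(f):
    # push negations down to atoms, flipping connectives and quantifiers
    op = f[0]
    if op == 'not':
        g = f[1]
        if g[0] == 'not':    return nnf(g[1])
        if g[0] == 'and':    return ('or',  nnf(('not', g[1])), nnf(('not', g[2])))
        if g[0] == 'or':     return ('and', nnf(('not', g[1])), nnf(('not', g[2])))
        if g[0] == 'forall': return ('exists', g[1], nnf(('not', g[2])))
        if g[0] == 'exists': return ('forall', g[1], nnf(('not', g[2])))
        return f
    if op in ('and', 'or'):        return (op, nnf(f[1]), nnf(f[2]))
    if op in ('forall', 'exists'): return (op, f[1], nnf(f[2]))
    return f

def rename(f, env=None):
    # standardize bound variables apart so quantifier prefixes can be merged
    env = env or {}
    op = f[0]
    if op in ('forall', 'exists'):
        new = next(_fresh)
        return (op, new, rename(f[2], {**env, f[1]: new}))
    if op in ('and', 'or'): return (op, rename(f[1], env), rename(f[2], env))
    if op == 'not':         return ('not', rename(f[1], env))
    return (f[0], f[1]) + tuple(env.get(a, a) for a in f[2:])   # predicate

def prenex(f):
    # pull quantifiers of both subformulas in front of the connective
    op = f[0]
    if op in ('forall', 'exists'):
        return (op, f[1], prenex(f[2]))
    if op in ('and', 'or'):
        left, right = prenex(f[1]), prenex(f[2])
        prefix = []
        while left[0] in ('forall', 'exists'):
            prefix.append((left[0], left[1])); left = left[2]
        while right[0] in ('forall', 'exists'):
            prefix.append((right[0], right[1])); right = right[2]
        body = (op, left, right)
        for q, v in reversed(prefix):
            body = (q, v, body)
        return body
    return f

def to_prenex(f):
    return prenex(rename(nnf(elim_implies(f))))

f = ('forall', 'x', ('implies', ('pred', 'P', 'x'),
                     ('exists', 'y', ('pred', 'Q', 'x', 'y'))))
print(to_prenex(f))   # forall v0 . exists v1 . (not P(v0)) or Q(v0, v1)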

Relevance:

40.00%

Publisher:

Abstract:

Accurate supersymmetric spectra are required to confront data from direct and indirect searches for supersymmetry. SuSeFLAV is a numerical tool capable of computing supersymmetric spectra precisely for various supersymmetry-breaking scenarios, applicable even in the presence of flavor violation. The program solves the MSSM RGEs with complete 3 x 3 flavor mixing at the 2-loop level and applies one-loop finite threshold corrections to all MSSM parameters, incorporating the radiative electroweak symmetry breaking conditions. The program also incorporates the Type-I seesaw mechanism with three massive right-handed neutrinos at user-defined mass scales and mixing. It also computes branching ratios of flavor violating processes such as l(j) -> l(i) gamma, l(j) -> 3 l(i), b -> s gamma, and supersymmetric contributions to flavor conserving quantities such as (g(mu) - 2). A large choice of executables suitable for various operations of the program is provided.
Program summary
Program title: SuSeFLAV
Catalogue identifier: AEOD_v1_0
Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEOD_v1_0.html
Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
Licensing provisions: GNU General Public License
No. of lines in distributed program, including test data, etc.: 76552
No. of bytes in distributed program, including test data, etc.: 582787
Distribution format: tar.gz
Programming language: Fortran 95
Computer: Personal Computer, Work-Station
Operating system: Linux, Unix
Classification: 11.6
Nature of problem: Determination of masses and mixing of supersymmetric particles within the context of the MSSM with conserved R-parity, with and without the presence of the Type-I seesaw. Inter-generational mixing is considered while calculating the mass spectrum. Supersymmetry breaking parameters are taken as inputs at a high scale specified by the mechanism of supersymmetry breaking. RG equations including full inter-generational mixing are then used to evolve these parameters to the electroweak breaking scale. The low energy supersymmetric spectrum is calculated at the scale where successful radiative electroweak symmetry breaking occurs. At the weak scale, standard model fermion masses and gauge couplings are determined including the supersymmetric radiative corrections. Once the spectrum is computed, the program proceeds to compute various lepton flavor violating observables (e.g., BR(mu -> e gamma), BR(tau -> mu gamma), etc.) at the weak scale.
Solution method: Two-loop RGEs with full 3 x 3 flavor mixing for all supersymmetry breaking parameters are used to compute the low energy supersymmetric mass spectrum. An adaptive step-size Runge-Kutta method is used to solve the RGEs numerically between the high scale and the electroweak breaking scale. An iterative procedure is employed to obtain a consistent radiative electroweak symmetry breaking condition. The masses of the supersymmetric particles are computed at 1-loop order. The third generation SM particles and the gauge couplings are evaluated at 1-loop order including supersymmetric corrections. A further iteration of the full program is employed such that the SM masses and couplings are consistent with the supersymmetric particle spectrum.
Additional comments: Several executables are presented for the user.
Running time: 0.2 s on an Intel(R) Core(TM) i5 CPU 650 at 3.20 GHz.
(c) 2012 Elsevier B.V. All rights reserved.
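
The adaptive step-size Runge-Kutta evolution described in the solution method can be illustrated with a toy example: the Python sketch below (not SuSeFLAV, which is written in Fortran 95) runs the one-loop MSSM gauge coupling RGEs from M_Z up to a high scale with an adaptive RK integrator; the numerical inputs are rough, illustrative values.

import numpy as np
from scipy.integrate import solve_ivp

b = np.array([33/5, 1.0, -3.0])        # one-loop MSSM beta coefficients

def rge(t, g):                         # t = ln(mu), g = (g1, g2, g3)
    return b * g**3 / (16 * np.pi**2)

MZ, MHIGH = 91.19, 2.0e16              # GeV, illustrative scales
g_mz = np.array([0.46, 0.65, 1.22])    # rough couplings at M_Z (GUT-normalised g1)

sol = solve_ivp(rge, [np.log(MZ), np.log(MHIGH)], g_mz,
                method='RK45', rtol=1e-8)   # adaptive step-size Runge-Kutta
print(sol.y[:, -1])                         # couplings at the high scale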