882 results for NETTRA-E2-FIFO (Computer program)


Relevance:

30.00%

Publisher:

Abstract:

The quality of an online university degree is paramount to the student, to the reputation of the university and, most importantly, to the profession that will be entered. At the School of Education within Curtin University, we aim to ensure that students in rural and remote areas are provided with high-quality degrees equal to those of their city counterparts who access face-to-face classes on campus. In 2010, the School of Education moved to flexible delivery of a fully online Bachelor of Education degree for its rural students. In previous years, the degree had been delivered in physical locations around the state. Although this served its purpose at the time, it restricted the degree to those rural students who were able to access a physical campus. The 2010 model allows access for any rural student with a computer and an internet connection, regardless of geographical location. As a result, new-student enrolments have increased. Academic staff had previously used an asynchronous environment to deliver learning modules housed within a learning management system (LMS). To enhance the learning environment and to provide high-quality learning experiences to students learning at a distance, synchronous software was adopted. This software is a real-time virtual classroom environment that allows for communication through Voice over Internet Protocol (VoIP) and videoconferencing, along with a large number of collaboration tools to engage learners. This research paper reports on the professional development of academic staff as they integrated a live e-learning solution into their existing LMS environment, including simultaneous technical orientation for teaching staff and course participants. Further, pedagogical innovations were offered to engage the students in a collaborative learning environment. Data were collected from academic staff through semi-structured interviews and participant observation. The findings discuss the perceived value of the technology, the problems encountered, and the solutions sought.

Relevance:

30.00%

Publisher:

Abstract:

Background: Child maltreatment has severe short- and long-term consequences for children's health, development, and wellbeing. Despite the provision of child protection education programs in many countries, few have been rigorously evaluated to determine their effectiveness. We describe the design of a multi-site, gold-standard evaluation of an Australian school-based child protection education program. The intervention has been developed by a not-for-profit agency and comprises five one-hour sessions delivered to first-grade students (aged 5-6 years) in their regular classrooms. It incorporates common attributes of effective programs identified in the literature, and aligns with the Australian education curriculum.

Methods/Design: A three-site cluster randomised controlled trial (RCT) of Learn to be safe with Emmy and friends™ will be conducted with children in approximately 72 first-grade classrooms in 24 Queensland primary (elementary) schools from three state regions, over a period of 2 years. Entire schools will be randomised to intervention and wait-list control conditions, using a computer-generated list of random numbers, to prevent contamination effects across students and classes. Data will be collected at baseline (pre-assessment), immediately after the intervention (post-assessment), and at 6, 12, and 18 months (follow-up assessments). Outcome assessors will be blinded to group membership. Primary outcomes assessed are children's knowledge of program concepts; intentions to use program knowledge, skills, and help-seeking strategies; actual use of program material in a simulated situation; and anxiety arising from program participation. Secondary outcomes include a parent discussion monitor, parent observations of their children's use of program materials, satisfaction with the program, and parental stress. A process evaluation will be conducted concurrently to assess program performance.

Discussion: This RCT addresses shortcomings in previous studies and methodologically extends research in this area by randomising at school level to prevent cross-learning between conditions; providing longer-term outcome assessment than any previous study; examining the degree to which parents/guardians discuss intervention content with children at home; assessing potential moderating/mediating effects of family and child demographic variables; testing an in-vivo measure of children's ability to discriminate safe/unsafe situations and disclose to trusted adults; and testing enhancements to existing measures to establish greater internal consistency.
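
School-level randomisation with a computer-generated list of random numbers can be sketched in a few lines of Python; the school identifiers below are hypothetical and, for brevity, the sketch ignores stratification by region:

```python
import random

# Hypothetical identifiers for the 24 participating schools.
schools = [f"school_{i:02d}" for i in range(1, 25)]

# Seed the generator so the allocation list is reproducible and auditable.
rng = random.Random(20160101)

# Shuffle the whole list, then split it: the first twelve schools receive
# the intervention, the remainder form the wait-list control arm.
shuffled = schools[:]
rng.shuffle(shuffled)
intervention, wait_list = shuffled[:12], shuffled[12:]

print("Intervention:", sorted(intervention))
print("Wait-list control:", sorted(wait_list))
```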

Relevance:

30.00%

Publisher:

Abstract:

Checkpoint-1 (Chk1) kinase plays an important role in G2/M cell-cycle control; its inhibition by small molecules is therefore of great therapeutic interest in oncology. In this paper, we report the virtual screening of an in-house library of 2499 pyranopyrazole derivatives against the ATP-binding site of Chk1 kinase using the Glide 5.0 program, which resulted in six hits. All of these ligands docked into the site, forming the most crucial interactions with the residues Cys87, Glu91, and Leu15. From the observed results, these ligands are suggested to be potent inhibitors of Chk1 kinase with sufficient scope for further elaboration.
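
The post-docking triage the abstract describes (keep hits that score well and contact Cys87, Glu91, and Leu15) can be sketched generically. The sketch below assumes a hypothetical CSV export of docking results; the file name, column names, and score cutoff are all invented for illustration and are not Glide's actual output format:

```python
import csv

REQUIRED_RESIDUES = {"CYS87", "GLU91", "LEU15"}
SCORE_CUTOFF = -8.0  # hypothetical docking-score threshold (kcal/mol)

hits = []
# Hypothetical export: one row per docked ligand, with its score and a
# semicolon-separated list of contacted residues.
with open("docking_results.csv", newline="") as fh:
    for row in csv.DictReader(fh):
        score = float(row["docking_score"])
        contacts = set(row["contacts"].split(";"))
        # Keep ligands that score below the cutoff (more negative is
        # better) and touch all three key ATP-site residues.
        if score <= SCORE_CUTOFF and REQUIRED_RESIDUES <= contacts:
            hits.append((row["ligand_id"], score))

for ligand, score in sorted(hits, key=lambda t: t[1]):
    print(f"{ligand}\t{score:.2f}")
```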

Relevance:

30.00%

Publisher:

Abstract:

The compound CdHgTe and its constituent binaries CdTe, HgTe, and CdHg are semiconductors used in thermal, infrared, nuclear, thermoelectric, and other photosensitive devices. CdHgTe has a sphalerite structure of possible type A(II)1B(II)1C(VI)6. The TERCP program of Kaufman is used to estimate the stable regions of the ternary phase diagram from available thermodynamic data. It was found that there was little variation in stoichiometry with temperature. The compositions were calculated for temperatures ranging from 325 K to 100 K, and the compositional limits were Cd(13-20)Hg(12-01)Te(75-79), with Hg varying most. By comparison with the similar compound CdIn2Te4, of forbidden band width 0.88 to 0.90 eV, similar properties are postulated for Cd1Hg1Te6, with applications in the infrared region of the spectrum at 300 K, where this composition is given by TERCP at the limit of stability.

Relevance:

30.00%

Publisher:

Abstract:

The method of structured programming, or program development using a top-down, stepwise refinement technique, provides a systematic approach for the development of programs of considerable complexity. The aim of this paper is to present the philosophy of structured programming through a case study of a non-numeric programming task. The problem of converting a well-formed formula in first-order logic into prenex normal form is considered. The program has been coded in the programming language PASCAL and implemented on a DEC-10 system. The program has about 500 lines of code and comprises 11 procedures.
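
The paper's program is roughly 500 lines of PASCAL; as a much smaller illustration in Python, the sketch below performs the central step of prenex conversion, pulling quantifiers out through the connectives, for formulas whose bound variables are already pairwise distinct (the formula encoding is my own, not the paper's):

```python
# Formulas: ("forall", v, f), ("exists", v, f), ("and", a, b),
# ("or", a, b), ("not", f), or an atom such as "P(x)".

def prenex(f):
    """Return (quantifier_prefix, matrix) for a rectified formula,
    i.e. one whose bound variables are pairwise distinct."""
    if isinstance(f, str):                       # atom: nothing to pull out
        return [], f
    op = f[0]
    if op in ("forall", "exists"):               # quantifier joins the prefix
        prefix, matrix = prenex(f[2])
        return [(op, f[1])] + prefix, matrix
    if op == "not":                              # negation flips quantifiers
        prefix, matrix = prenex(f[1])
        flipped = [("exists" if q == "forall" else "forall", v)
                   for q, v in prefix]
        return flipped, ("not", matrix)
    if op in ("and", "or"):                      # pull prefixes of both sides
        p1, m1 = prenex(f[1])
        p2, m2 = prenex(f[2])
        return p1 + p2, (op, m1, m2)
    raise ValueError(f"unknown connective: {op}")

# forall x. (P(x) and exists y. Q(y))  ==>  forall x. exists y. (P(x) and Q(y))
prefix, matrix = prenex(("forall", "x",
                         ("and", "P(x)", ("exists", "y", "Q(y)"))))
print(prefix, matrix)
```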

Relevance:

30.00%

Publisher:

Abstract:

Knowledge of a program's worst-case execution time (WCET) is essential in validating real-time systems and helps in effective scheduling. One popular approach used in industry is to measure the execution time of program components on the target architecture and combine the measurements using static analysis of the program. Measurements need to be taken in the least intrusive way in order to avoid affecting the accuracy of the estimated WCET. Several programs exhibit phase behavior, wherein the program's dynamic execution is observed to be composed of phases; each phase, distinct from the others, exhibits homogeneous behavior with respect to cycles per instruction (CPI), data cache misses, etc. In this paper, we show that phase behavior has important implications for timing analysis. We make use of the homogeneity of a phase to reduce instrumentation overhead while ensuring that the accuracy of the WCET estimate is not largely affected. We propose a model for estimating WCET using static worst-case instruction counts of individual phases and a function of the measured average CPI. We describe a WCET analyzer built on this model which targets two different architectures. The analyzer is observed to give safe estimates for most benchmarks considered in this paper, and the tightness of its WCET estimates improves for most benchmarks compared to Chronos, a well-known static WCET analyzer.
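
A minimal sketch of the proposed estimation model as described: combine each phase's static worst-case instruction count with its measured average CPI. All numbers below, and the safety padding applied to the CPI, are invented for illustration:

```python
# Per-phase data: static worst-case instruction count and the average CPI
# measured for that phase on the target (hypothetical values).
phases = [
    {"name": "init",    "worst_case_instructions": 120_000,   "measured_cpi": 1.4},
    {"name": "compute", "worst_case_instructions": 2_500_000, "measured_cpi": 0.9},
    {"name": "output",  "worst_case_instructions": 80_000,    "measured_cpi": 2.1},
]

CLOCK_HZ = 400_000_000  # hypothetical 400 MHz target
SAFETY_MARGIN = 1.10    # pad the measured CPI to keep the estimate safe

# WCET in cycles: sum over phases of (instruction count x padded CPI).
wcet_cycles = sum(p["worst_case_instructions"] * p["measured_cpi"] * SAFETY_MARGIN
                  for p in phases)
print(f"estimated WCET: {wcet_cycles / CLOCK_HZ * 1e3:.3f} ms")
```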

Relevance:

30.00%

Publisher:

Abstract:

Accurate supersymmetric spectra are required to confront data from direct and indirect searches for supersymmetry. SuSeFLAV is a numerical tool capable of computing supersymmetric spectra precisely for various supersymmetry-breaking scenarios, applicable even in the presence of flavor violation. The program solves the MSSM RGEs with complete 3 x 3 flavor mixing at the 2-loop level and one-loop finite threshold corrections to all MSSM parameters, incorporating radiative electroweak symmetry breaking conditions. The program also incorporates the Type-I seesaw mechanism with three massive right-handed neutrinos at user-defined mass scales and mixing. It also computes branching ratios of flavor-violating processes such as l_j -> l_i gamma, l_j -> 3 l_i, and b -> s gamma, and supersymmetric contributions to flavor-conserving quantities such as (g_mu - 2). A large choice of executables suitable for the various operations of the program is provided.

Program summary
Program title: SuSeFLAV
Catalogue identifier: AEOD_v1_0
Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEOD_v1_0.html
Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
Licensing provisions: GNU General Public License
No. of lines in distributed program, including test data, etc.: 76552
No. of bytes in distributed program, including test data, etc.: 582787
Distribution format: tar.gz
Programming language: Fortran 95
Computer: Personal Computer, Work-Station
Operating system: Linux, Unix
Classification: 11.6
Nature of problem: Determination of the masses and mixing of supersymmetric particles within the context of the MSSM with conserved R-parity, with and without the Type-I seesaw. Inter-generational mixing is considered while calculating the mass spectrum. Supersymmetry-breaking parameters are taken as inputs at a high scale specified by the mechanism of supersymmetry breaking. RG equations including full inter-generational mixing are then used to evolve these parameters down to the electroweak breaking scale. The low-energy supersymmetric spectrum is calculated at the scale where successful radiative electroweak symmetry breaking occurs. At the weak scale, Standard Model fermion masses and gauge couplings are determined including the supersymmetric radiative corrections. Once the spectrum is computed, the program proceeds to compute various lepton-flavor-violating observables (e.g., BR(mu -> e gamma), BR(tau -> mu gamma)) at the weak scale.
Solution method: Two-loop RGEs with full 3 x 3 flavor mixing for all supersymmetry-breaking parameters are used to compute the low-energy supersymmetric mass spectrum. An adaptive-step-size Runge-Kutta method is used to solve the RGEs numerically between the high scale and the electroweak breaking scale. An iterative procedure is employed to obtain a consistent radiative electroweak symmetry breaking condition. The masses of the supersymmetric particles are computed at 1-loop order. The third-generation SM particle masses and the gauge couplings are evaluated at 1-loop order including supersymmetric corrections. A further iteration of the full program is employed so that the SM masses and couplings are consistent with the supersymmetric particle spectrum.
Additional comments: Several executables are presented for the user.
Running time: 0.2 s on an Intel(R) Core(TM) i5 CPU 650 at 3.20 GHz.
(c) 2012 Elsevier B.V. All rights reserved.
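
SuSeFLAV's numerical core, adaptive-step Runge-Kutta evolution of RGEs between two scales, can be illustrated on a far smaller system. The sketch below runs the familiar one-loop Standard Model gauge couplings from M_Z to a high scale with SciPy's adaptive RK45; it is a toy stand-in for the full 2-loop, flavor-mixed MSSM system, not SuSeFLAV's Fortran implementation:

```python
import numpy as np
from scipy.integrate import solve_ivp

# One-loop SM beta-function coefficients for (g1, g2, g3) in GUT
# normalisation: dg_i/dt = b_i g_i^3 / (16 pi^2), with t = ln(mu).
B = np.array([41.0 / 10.0, -19.0 / 6.0, -7.0])

def beta(t, g):
    return B * g**3 / (16.0 * np.pi**2)

# Couplings at the Z mass (approximate experimental values).
g_mz = np.array([0.46, 0.65, 1.22])
t_mz, t_high = np.log(91.19), np.log(2.0e16)  # M_Z -> high scale, in GeV

# RK45 is an adaptive-step-size Runge-Kutta method, as in the paper.
sol = solve_ivp(beta, (t_mz, t_high), g_mz, method="RK45", rtol=1e-8)
print("couplings at 2e16 GeV:", sol.y[:, -1])
```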

Relevance:

30.00%

Publisher:

Abstract:

The Biscayne Bay Benthic Sampling Program was divided into two phases. In Phase I, sixty sampling stations representing diverse habitats were established in Biscayne Bay (including Dumfoundling Bay and Card Sound). The stations were visited in the wet season (late fall of 1981) and in the dry season (midwinter of 1982). At each station certain abiotic conditions were measured or estimated: depth, sources of freshwater inflow and pollution, bottom characteristics, current direction and speed, and surface and bottom temperature, salinity, and dissolved oxygen; water clarity was estimated with a Secchi disk. Seagrass blades and macroalgae were counted in a 0.1-m2 grid placed so as to best represent the bottom community within a 50-foot radius. Underwater 35-mm photographs of the bottom were made using flash apparatus. Benthic samples were collected using a petite Ponar dredge; these samples were washed through a 5-mm mesh screen, fixed in formalin in the field, and later sorted and identified by experts to a pre-agreed taxonomic level. During the wet-season sampling period, a nonquantitative one-meter-wide trawl was made of the epibenthic community; these samples were also washed, fixed, sorted, and identified. During the dry-season sampling period, sediment cores were collected at each station not located on bare rock and were analyzed for sediment size and organic composition by personnel of the University of Miami. Data resulting from the sampling were entered into a computer and subjected to cluster analyses, Shannon-Weaver diversity analysis, multiple regression analysis of variance and covariance, and factor analysis.

In Phase II of the program, fifteen stations were selected from among the sixty of Phase I and sampled quarterly. Each quarter, five petite Ponar dredge samples were collected from each station. As in Phase I, observations and measurements, including seagrass blade counts, were made at each station. In Phase II, the polychaete specimens collected were given to a separate contractor for analysis to the species level. These analyses included the mean, standard deviation, coefficient of dispersion, percent of total, and numeric rank for each organism at each station, as well as the number of species, Shannon-Weaver taxa diversity, and dominance (the complement of Simpson's Index) for each station. Multiple regression analysis of variance and covariance, and factor analysis, were applied to the data to determine the effect of the abiotic factors measured at each station. (PDF contains 96 pages)
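
The diversity statistics named in the report are simple to compute from per-station counts; a minimal sketch with invented counts, reading "dominance" as the report does, i.e. the complement of Simpson's Index:

```python
import math

# Hypothetical counts of individuals per taxon at one station.
counts = {"Polychaeta sp. A": 52, "Bivalvia sp. B": 31, "Amphipoda sp. C": 17}

total = sum(counts.values())
proportions = [n / total for n in counts.values()]

# Shannon-Weaver taxa diversity: H' = -sum(p_i * ln p_i).
shannon = -sum(p * math.log(p) for p in proportions)

# "Dominance" as defined in the report: the complement of Simpson's
# Index, here computed as 1 - sum(p_i^2).
simpson_complement = 1.0 - sum(p * p for p in proportions)

print(f"species: {len(counts)}, H' = {shannon:.3f}, "
      f"1 - Simpson = {simpson_complement:.3f}")
```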

Relevance:

30.00%

Publisher:

Abstract:

Association for Computing Machinery, ACM; IEEE; IEEE Computer Society; SIGSOFT

Relevance:

30.00%

Publisher:

Abstract:

We first pose the following problem: to develop a program which takes line-drawings as input and constructs three-dimensional objects as output, such that the output objects are the same as the ones we see when we look at the input line-drawing. We then introduce the principle of minimum standard-deviation of angles (MSDA) and discuss a program based on MSDA. We present the results of testing this program with a variety of line-drawings and show that the program constitutes a solution to the stated problem over the range of line-drawings tested. Finally, we relate this work to its historical antecedents in the psychological and computer-vision literature.
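
As a hedged illustration of the MSDA principle (the example drawing, data structures, and optimizer below are mine, not the paper's): treat each vertex's depth as an unknown, lift the drawing into 3-D, and prefer the interpretation whose junction angles have the smallest standard deviation.

```python
import itertools
import numpy as np
from scipy.optimize import minimize

# A hypothetical line-drawing: 2-D vertex positions and edges between them.
xy = {"A": (0.0, 0.0), "B": (1.0, 0.1), "C": (0.2, 1.0), "D": (0.9, 0.9)}
edges = [("A", "B"), ("A", "C"), ("B", "D"), ("C", "D"), ("A", "D")]
names = sorted(xy)

def junction_angles(z):
    """All angles between pairs of edges meeting at each vertex, with the
    drawing lifted to 3-D using the candidate depths z."""
    p = {n: np.array([*xy[n], z[i]]) for i, n in enumerate(names)}
    angles = []
    for v in names:
        # Edge vectors pointing away from vertex v.
        vecs = [p[b] - p[a] if a == v else p[a] - p[b]
                for a, b in edges if v in (a, b)]
        for u, w in itertools.combinations(vecs, 2):
            c = np.dot(u, w) / (np.linalg.norm(u) * np.linalg.norm(w))
            angles.append(np.arccos(np.clip(c, -1.0, 1.0)))
    return angles

# MSDA objective: the standard deviation of all junction angles.
objective = lambda z: np.std(junction_angles(z))

# Fix one depth at zero to remove a trivial degree of freedom.
res = minimize(lambda z3: objective(np.array([0.0, *z3])),
               x0=np.array([0.3, 0.3, 0.3]))
print("recovered depths:", [0.0, *np.round(res.x, 3)])
```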

Relevance:

30.00%

Publisher:

Abstract:

The key to understanding a program is recognizing familiar algorithmic fragments and data structures in it. Automating this recognition process will make it easier to perform many tasks which require program understanding, e.g., maintenance, modification, and debugging. This report describes a recognition system, called the Recognizer, which automatically identifies occurrences of stereotyped computational fragments and data structures in programs. The Recognizer is able to identify these familiar fragments and structures, even though they may be expressed in a wide range of syntactic forms. It does so systematically and efficiently by using a parsing technique. Two important advances have made this possible. The first is a language-independent graphical representation for programs and programming structures which canonicalizes many syntactic features of programs. The second is an efficient graph parsing algorithm.
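
The Recognizer itself parses a language-independent graph representation; as a loose and much simplified analogue, the sketch below matches a stereotyped fragment (an accumulation loop) against Python's own AST, which likewise abstracts away much surface syntax. This is AST pattern matching, not the report's graph parsing:

```python
import ast

SOURCE = """
total = 0
for x in values:
    total = total + x
"""

def is_accumulation_loop(node):
    """Recognise the cliche 'acc = acc + elem' inside a for-loop body."""
    if not isinstance(node, ast.For) or len(node.body) != 1:
        return False
    stmt = node.body[0]
    return (isinstance(stmt, ast.Assign)
            and isinstance(stmt.value, ast.BinOp)
            and isinstance(stmt.value.op, ast.Add)
            and isinstance(stmt.targets[0], ast.Name)
            and isinstance(stmt.value.left, ast.Name)
            and stmt.targets[0].id == stmt.value.left.id)

tree = ast.parse(SOURCE)
found = [n for n in ast.walk(tree) if is_accumulation_loop(n)]
print("accumulation loops found:", len(found))
```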

Relevance:

30.00%

Publisher:

Abstract:

A fundamental problem in artificial intelligence is obtaining coherent behavior in rule-based problem solving systems. A good quantitative measure of coherence is time behavior; a system that never, in retrospect, applied a rule needlessly is certainly coherent; a system suffering from combinatorial blowup is certainly behaving incoherently. This report describes a rule-based problem solving system for automatically writing and improving numerical computer programs from specifications. The specifications are in terms of "constraints" among inputs and outputs. The system has solved program synthesis problems involving systems of equations, determining that methods of successive approximation converge, transforming recursion to iteration, and manipulating power series (using differing organizations, control structures, and argument-passing techniques).
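
One of the synthesis tasks listed, transforming recursion to iteration, is easy to show by hand; the before/after pair below illustrates the kind of rewrite such a system derives (my own example, not one from the report):

```python
# Recursive specification: a linear recursion on n.
def fact_rec(n):
    return 1 if n == 0 else n * fact_rec(n - 1)

# The standard rewrite: thread the pending multiplications through an
# accumulator, turning the recursion into a loop.
def fact_iter(n):
    acc = 1
    while n > 0:
        acc, n = acc * n, n - 1
    return acc

assert fact_rec(10) == fact_iter(10) == 3628800
```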

Relevance:

30.00%

Publisher:

Abstract:

A computer may gather a great deal of information from its environment in an optical or graphical manner. A scene, as seen for instance by a TV camera or in a picture, can be transformed into a symbolic description of points and lines or surfaces. This thesis describes several programs, written in the language CONVERT, for the analysis of such descriptions in order to recognize, differentiate, and identify desired objects or classes of objects in the scene. Examples are given in each case. Although the recognition might be in terms of projections of 2-dim and 3-dim objects, we do not deal with stereoscopic information. One of our programs (Polybrick) identifies parallelepipeds in a scene which may contain partially hidden bodies and non-parallelepipedic objects. The program TD works mainly with 2-dimensional figures, although under certain conditions it successfully identifies 3-dim objects. Overlapping objects are identified when they are transparent. A third program, DT, works with 3-dim and 2-dim objects, and does not identify objects which are not completely seen. Important restrictions and suppositions are: (a) the input is assumed perfect (noiseless) and in a symbolic format; (b) no perspective deformation is considered. A portion of this thesis is devoted to the study of models (symbolic representations) of the objects we want to identify; different schemes, some of them already in use, are discussed. Focusing our attention on the more general problem of identifying general objects when they substantially overlap, we propose some schemes for their recognition and also analyze some problems that arise.

Relevance:

30.00%

Publisher:

Abstract:

PILOT is a programming system constructed in LISP. It is designed to facilitate the development of programs by easing the familiar sequence: write some code, run the program, make some changes, write some more code, run the program again, etc. As a program becomes more complex, making these changes becomes harder and harder because the implications of changes are harder to anticipate. In the PILOT system, the computer plays an active role in this evolutionary process by providing the means whereby changes can be effected immediately, and in ways that seem natural to the user. The user of PILOT feels that he is giving advice, or making suggestions, to the computer about the operation of his programs, and that the system then performs the work necessary. The PILOT system is thus an interface between the user and his program, monitoring both the requests of the user and the operation of his program. The user may easily modify the PILOT system itself by giving it advice about its own operation. This allows him to develop his own language and to shift gradually onto PILOT the burden of performing routine but increasingly complicated tasks. In this way, he can concentrate on the conceptual difficulties in the original problem, rather than on the niggling tasks of editing, rewriting, or adding to his programs. Two detailed examples are presented. PILOT is a first step toward computer systems that will help man to formulate problems in the same way they now help him to solve them. Experience with it supports the claim that such "symbiotic systems" allow the programmer to attack and solve more difficult problems.

Relevance:

30.00%

Publisher:

Abstract:

The constraint paradigm is a model of computation in which values are deduced whenever possible, under the limitation that deductions be local in a certain sense. One may visualize a constraint 'program' as a network of devices connected by wires. Data values may flow along the wires, and computation is performed by the devices. A device computes using only locally available information (with a few exceptions), and places newly derived values on other, locally attached wires. In this way computed values are propagated. An advantage of the constraint paradigm (not unique to it) is that a single relationship can be used in more than one direction: the connections to a device are not labelled as inputs and outputs; a device will compute with whatever values are available, and produce as many new values as it can. General theorem provers are capable of such behavior, but tend to suffer from combinatorial explosion; it is not usually useful to derive all the possible consequences of a set of hypotheses. The constraint paradigm places a certain kind of limitation on the deduction process. The limitations imposed by the constraint paradigm are not the only ones possible. It is argued, however, that they are restrictive enough to forestall combinatorial explosion in many interesting computational situations, yet permissive enough to allow useful computations in practical situations. Moreover, the paradigm is intuitive: it is easy to visualize the computational effects of these particular limitations, and the paradigm is a natural way of expressing programs for certain applications, in particular for relationships arising in computer-aided design. A number of implementations of constraint-based programming languages are presented. A progression of ever more powerful languages is described, complete implementations are presented, and design difficulties and alternatives are discussed. The goal approached, though not quite reached, is a complete programming system which will implicitly support the constraint paradigm to the same extent that LISP, say, supports automatic storage management.
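
A bare-bones sketch of the wires-and-devices model just described: connectors hold values and wake the devices attached to them, and a device such as an adder computes in whichever direction it has enough information. This is a minimal rendition of the paradigm, not the thesis's implementation:

```python
class Wire:
    """A connector that stores at most one value and wakes its devices."""
    def __init__(self, name):
        self.name, self.value, self.devices = name, None, []

    def set(self, value):
        if self.value is None:
            self.value = value
            for d in self.devices:        # propagate to attached devices
                d.update()
        elif self.value != value:
            raise ValueError(f"contradiction on {self.name}")

class Adder:
    """A device enforcing a + b = s, usable in any direction."""
    def __init__(self, a, b, s):
        self.a, self.b, self.s = a, b, s
        for w in (a, b, s):
            w.devices.append(self)

    def update(self):
        a, b, s = self.a.value, self.b.value, self.s.value
        if a is not None and b is not None: self.s.set(a + b)
        elif a is not None and s is not None: self.b.set(s - a)
        elif b is not None and s is not None: self.a.set(s - b)

# The same constraint used "backwards": given the sum and one addend,
# the network deduces the other addend.
x, y, total = Wire("x"), Wire("y"), Wire("total")
Adder(x, y, total)
total.set(10)
x.set(3)
print(y.value)  # -> 7
```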