993 results for digital delay-line interpolation


Relevance:

30.00%

Publisher:

Abstract:

Introduced in Brazil in the 1950s, television, the medium with the greatest reach and ideological power over its audiences, is undergoing its second great transformation: it is ceasing to be analogue and becoming digital. This brings with it new possibilities of reception and the potential convergence of media. In this context, the aim of this study was to analyse the rollout of the Brazilian Digital Terrestrial Television System (SBTVD-t) through the on-line clipping service of the Fórum Nacional pela Democratização da Comunicação (FNDC). The analysis of this material focused on determining whether or not the FNDC was biased towards coverage of the technical aspects of the new technology, to the detriment of its social potential. To that end, a quantitative approach was adopted, and the information and data collected showed that the FNDC is not very effective as a critical, evaluative check on the mainstream media, and that it fails to meet some of its own objectives by reproducing the discourse and ideologies of other outlets. Likewise, it was found that the Federal Government also departed from the objectives listed in the presidential decrees that establish and regulate the implantation of the SBTVD. (AU)

Relevance:

30.00%

Publisher:

Abstract:

This thesis is concerned with the study of a non-sequential identification technique, so that it may be applied to the identification of process plant mathematical models from process measurements with the greatest degree of accuracy and reliability. In order to study the accuracy of the technique under differing conditions, simple mathematical models were set up on a parallel hybrid computer and these models were identified from input/output measurements by a small on-line digital computer. Initially, the simulated models were identified on-line. However, this method of operation was found unsuitable for a thorough study of the technique due to equipment limitations. Further analysis was carried out on a large off-line computer using data generated by the small on-line computer; hence the identification was not strictly on-line. Results of the work have shown that the identification technique may be successfully applied in practice. An optimum sampling period is suggested, together with noise level limitations for maximum accuracy. A description of a double-effect evaporator is included in this thesis. It is proposed that the next stage in the work will be the identification of a mathematical model of this evaporator using the technique described.
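A minimal sketch of the batch least-squares step underlying this kind of input/output identification, assuming a first-order discrete model; the model structure, noise level and parameter values are illustrative and are not taken from the thesis:

    import numpy as np

    # Identify y[k] = a*y[k-1] + b*u[k-1] + noise from sampled
    # input/output records by batch least squares. The first-order
    # structure and values below are illustrative assumptions, not
    # the thesis's exact non-sequential technique.
    rng = np.random.default_rng(0)
    a_true, b_true = 0.8, 0.5
    u = rng.standard_normal(500)              # input sequence
    y = np.zeros(500)
    for k in range(1, 500):
        y[k] = a_true * y[k-1] + b_true * u[k-1] + 0.05 * rng.standard_normal()

    # Stack regressors [y[k-1], u[k-1]] and solve for (a, b)
    Phi = np.column_stack([y[:-1], u[:-1]])
    theta, *_ = np.linalg.lstsq(Phi, y[1:], rcond=None)
    print("estimated a, b:", theta)           # close to (0.8, 0.5)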

Relevance:

30.00%

Publisher:

Abstract:

There is a growing demand for data transmission over digital networks involving mobile terminals. An important class of data required for transmission to mobile terminals is image information such as street maps, floor plans and identikit images. This sort of transmission is of particular interest to services such as the police force, fire brigade and medical services. These services cannot be provided directly to mobile terminals because of the limited capacity of the mobile channels and the transmission errors caused by multipath (Rayleigh) fading. In this research, the transmission of line diagram images, such as floor plans and street maps, over digital networks involving mobile terminals at transmission rates of 2400 bit/s and 4800 bit/s has been studied. A low bit-rate source encoding technique using geometric codes is found to be suitable to represent line diagram images. In geometric encoding, the amount of data required to represent or store a line diagram image is proportional to the image detail; thus a simple line diagram image requires only a small amount of data. To study the effect of transmission errors due to mobile channels on the transmitted images, error sources (error files) representing mobile channels under different conditions have been produced using channel modelling techniques. Satisfactory models of the mobile channel have been obtained when compared with field test measurements. Subjective performance tests have been carried out to evaluate the quality and usefulness of the received line diagram images under various mobile channel conditions, and the effect of mobile transmission errors on the quality of the received images has been determined. To improve the quality of the received images under various mobile channel conditions, forward error correcting (FEC) codes with interleaving and automatic repeat request (ARQ) schemes have been proposed. The performance of the error control codes has been evaluated under various mobile channel conditions. It has been shown that an FEC code with interleaving can be used effectively to improve the quality of the received images under both normal and severe mobile channel conditions. Under normal channel conditions, similar results have been obtained using ARQ schemes; however, under severe mobile channel conditions, the FEC code with interleaving shows better performance.
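As a rough illustration of the geometric-coding idea, the sketch below stores a line diagram as packed segment endpoints, so the byte count grows with the number of segments rather than with image resolution; the packing format is an assumption, not the specific code studied in the thesis:

    import struct

    # A line diagram is stored as a list of straight segments rather
    # than pixels, so data volume is proportional to image detail.
    # This 16-bit packing format is an illustrative assumption.
    def encode_segments(segments):
        """Pack (x1, y1, x2, y2) segments as 16-bit unsigned coords."""
        out = struct.pack(">H", len(segments))        # segment count header
        for x1, y1, x2, y2 in segments:
            out += struct.pack(">4H", x1, y1, x2, y2)
        return out

    def decode_segments(data):
        (n,) = struct.unpack_from(">H", data, 0)
        return [struct.unpack_from(">4H", data, 2 + 8 * i) for i in range(n)]

    floor_plan = [(0, 0, 100, 0), (100, 0, 100, 80),
                  (100, 80, 0, 80), (0, 80, 0, 0)]
    blob = encode_segments(floor_plan)
    print(len(blob), "bytes for", len(floor_plan), "segments")  # 2 + 8 each
    assert decode_segments(blob) == floor_plan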

Relevance:

30.00%

Publisher:

Abstract:

It is numerically demonstrated, for the first time, that dispersion management and in-line nonlinear optical loop mirrors can achieve all-optical passive regeneration and distance-unlimited transmission of a soliton data stream at 40 Gbit/s over standard fibre.
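For context, results of this kind are typically obtained by split-step Fourier integration of the nonlinear Schrödinger equation; a minimal normalized sketch follows, with illustrative grid and step sizes and without the paper's dispersion map or loop-mirror regenerator:

    import numpy as np

    # Split-step Fourier integration of the normalized NLSE
    #   i u_z + (1/2) u_tt + |u|^2 u = 0,
    # launched with a fundamental soliton. Grid, step sizes and
    # distance are illustrative assumptions only.
    nt, dt = 1024, 0.05
    t = (np.arange(nt) - nt // 2) * dt
    w = 2 * np.pi * np.fft.fftfreq(nt, dt)       # angular frequency grid
    u = 1 / np.cosh(t)                           # fundamental soliton input

    dz, steps = 0.01, 500
    for _ in range(steps):
        u = np.fft.ifft(np.exp(-0.5j * w**2 * dz) * np.fft.fft(u))  # dispersion
        u = u * np.exp(1j * np.abs(u)**2 * dz)                      # Kerr step

    print("peak in/out:", 1.0, np.abs(u).max())  # stays ~1: soliton preserved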

Relevance:

30.00%

Publisher:

Abstract:

As we welcome 2014 we say goodbye to 2013, and I must start with an apology to authors who have submitted papers to CLAE and seen a delay in either the review process or the hard-copy publication of their proofed article. The delays were caused by a major hike in the number of submissions to the journal in 2012, which increased further in 2013. In the 12 months leading to the end of October 2011 we had 94 new paper submissions; for the same period to the end of October 2012 the journal had 116 new papers. In 2012 we were awarded an impact factor for the first time, and the following 12-month period to the end of October 2013 saw a massive increase in submissions, with 171 new manuscripts submitted. This is nearly twice as many papers as two years ago and three times as many as when I took over as Editor-in-Chief. In addition, UK academics will know that 2014 is a REF (Research Excellence Framework) year, in which universities are judged on their research; one of the major components of this measure remains published papers, so there is a push to publish before the REF deadline for counting. The rejection rate at CLAE has gone up too and is currently around 50%, more than double the rejection rate when I took over as Editor-in-Chief. The number of pages that CLAE publishes each year has remained the same since 2007. When compiling issue 1 for 2014 I chose the papers to be included from those that were proofed and ready to go, and there were around 200 proofed pages ready, which is enough to fill 3½ issues! At present Elsevier and the BCLA are preparing to increase the number of pages published per issue so that we can clear some of this backlog and remain up to date with the papers published in CLAE. I should add that online publication of papers is unaffected: although there may have been review delays, there are no delays in publishing online, so authors can still get a final online version of their paper with a DOI (digital object identifier) number enabling the paper to be cited. There are two awards made in 2013 that I would like to make special mention of. One was to my good friend Jan Bergmanson, who was awarded an honorary life fellowship of the College of Optometrists. Jan has served on the editorial board of CLAE for many years and in 2013 also celebrated 30 years of his annual 'Texan Corneal and Contact Lens Meeting'. The other award I wish to mention went to Judith Morris, who was the BCLA Gold Medal Award winner in 2013. Judith has had many roles in her career, having worked at Moorfields Eye Hospital, the Institute of Optometry and currently City University. She has been the Europe, Middle East and Africa President of IACLE (International Association of Contact Lens Educators) for many years, and I think I am correct in saying that Judith is the only person to have been President of both the BCLA (1983) and, a few years later, the College of Optometrists (1989). Judith was also instrumental in introducing Vivien Freeman to the BCLA: they had been friends, and Judith suggested that Vivien apply for an administrative job at the BCLA. Fast forward 29 years, and in December 2013 Vivien stepped down as Secretary General of the BCLA. I would like to offer my own personal thanks to Vivien for her support of CLAE and of me over the years. The BCLA will not be the same, and I wish you well in your future plans. But 2014 brings a new position to the BCLA: Cheryl Donnelly has been given the new role of Chief Executive Officer.
Cheryl was President of the BCLA in 2000 and has previously served on Council. I look forward to working with Cheryl and envisage a bright future for the BCLA and CLAE. In this issue we have some great papers, including some from authors who have not published with CLAE before. There is a nice paper on contact lens compliance in Nepal which brings home some familiar messages from an emerging market. A paper on how corneal curvature is affected by the use of hydrogel lenses is useful when advising patients how long they should leave their contact lenses out to avoid changes in refraction or curvature; this is useful information when refracting these patients or before laser surgery. There is a useful paper offering tips on fitting bitoric gas permeable lenses after corneal graft, and a paper detailing surgery to implant piggyback multifocal intraocular lenses. One thing I noted from the selection of papers in the current issue is where they were from: none of the corresponding authors are from the United Kingdom. There are two papers each from the United States, Spain and Iran, and one each from the Netherlands, Ireland, the Republic of Korea, Australia and Hong Kong. This is an obvious reflection of the widening interest in CLAE and the BCLA, and indicates the new research groups emerging in the field.

Relevance:

30.00%

Publisher:

Abstract:

Nowadays the great public libraries in Bulgaria are taking on the appearance of digital centres that provide new information resources and services in the digital space. Digital conversion as a means of preservation is one of the main priorities of the Regional Public Library in Veliko Tarnovo. Over the last few years we have persistently sought possible sources of financing from national and foreign programmes in this direction. Initially the strategy was oriented towards digitisation of the holdings in most urgent need of conversion: the local-studies periodicals from 1878 to 1944. Digitising these holdings will lay the foundation of a full-text database of Bulgarian periodical publications. The technology on offer makes it possible to include other libraries in the Unified Index, which could develop into a National Unified Index of periodical publications. The integrated information environment that has been created is an attractive, comfortable and useful workplace, at home or in the office, for researchers, historians, art experts and bibliographers. The library's readers make very active use of all the information services on the library's web page and work competently with the on-line indexes provided there; they find the title they need, which can later be requested for use at home or in the library, again by electronic means.

Relevance:

30.00%

Publisher:

Abstract:

Limited literature regarding parameter estimation of dynamic systems has been identified as the central reason why parametric bounds are not available for chaotic time series. The literature suggests that a chaotic system displays a sensitive dependence on initial conditions, and our study reveals that the behavior of a chaotic system is also sensitive to changes in parameter values. Therefore, a parameter estimation technique could make it possible to establish parametric bounds on the nonlinear dynamic system underlying a given time series, which in turn can improve predictability. By extracting the relationship between parametric bounds and predictability, we implemented chaos-based models for improving prediction in time series. This study describes work done to establish bounds on a set of unknown parameters. Our results reveal that by establishing parametric bounds it is possible to improve the predictability of any time series, even though the dynamics or the mathematical model of that series is not known a priori. In our attempt to improve the predictability of various time series, we have established bounds for a set of unknown parameters: (i) the embedding dimension used to unfold a set of observations in the phase space, (ii) the time delay to use for a series, (iii) the number of neighborhood points to use while avoiding the detection of false neighbors and (iv) the degree of the local polynomial used to build numerical interpolation functions from one region to another. Using these bounds, we are able to obtain better predictability in chaotic time series than previously reported. In addition, the developments of this dissertation establish a theoretical framework for investigating predictability in time series from the system-dynamics point of view. In closing, our procedure significantly reduces computer resource usage, as the search method is refined and efficient. Finally, the uniqueness of our method lies in its ability to extract the chaotic dynamics inherent in a nonlinear time series by observing its values.
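A minimal sketch of how these parameters enter a chaos-based predictor, using delay-coordinate embedding and nearest-neighbour averaging on a logistic-map series; the map and the parameter values (m, tau, k) are illustrative assumptions:

    import numpy as np

    # Delay embedding plus k-nearest-neighbour prediction for a scalar
    # chaotic series. The logistic map and the values of the bounded
    # parameters below are illustrative assumptions.
    def logistic(n, x0=0.3, r=3.9):
        x = np.empty(n); x[0] = x0
        for i in range(1, n):
            x[i] = r * x[i-1] * (1 - x[i-1])
        return x

    def embed(x, m, tau):
        """Return delay vectors [x[i], x[i+tau], ..., x[i+(m-1)tau]]."""
        n = len(x) - (m - 1) * tau
        return np.column_stack([x[i*tau : i*tau + n] for i in range(m)])

    x = logistic(2000)
    m, tau, k = 3, 1, 4                      # embedding dim, delay, neighbours
    V = embed(x[:-1], m, tau)                # reconstructed states
    y = x[(m - 1) * tau + 1:]                # next value after each state

    query = V[-1]
    d = np.linalg.norm(V[:-1] - query, axis=1)
    nn = np.argsort(d)[:k]                   # k nearest past states
    print("predicted:", y[nn].mean(), "actual:", y[-1])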

Relevance:

30.00%

Publisher:

Abstract:

As traffic congestion continues to worsen in large urban areas, solutions are urgently sought. However, transportation planning models, which estimate traffic volumes on transportation network links, are often unable to realistically consider travel time delays at intersections. Introducing signal controls into models often results in significant and unstable changes in network attributes, which, in turn, leads to instability of the models. Ignoring the effect of delays at intersections makes the model output inaccurate and unable to predict travel time. To represent traffic conditions in a network more accurately, planning models should be capable of arriving at a network solution based on travel costs that are consistent with the intersection delays due to signal controls. This research attempts to achieve this goal by optimizing signal controls and estimating intersection delays accordingly, which are then used in traffic assignment. Simultaneous optimization of traffic routing and signal controls has not been accomplished in real-world applications of traffic assignment. To this end, a delay model dealing with five major types of intersections has been developed using artificial neural networks (ANNs). An ANN architecture consists of interconnected artificial neurons. The architecture may either be used to gain an understanding of biological neural networks, or to solve artificial intelligence problems without necessarily creating a model of a real biological system. The ANN delay model has been trained using extensive simulations based on TRANSYT-7F signal optimizations. The delay estimates by the ANN delay model have percentage root-mean-squared errors (%RMSE) of less than 25.6%, which is satisfactory for planning purposes. Larger prediction errors are typically associated with severely oversaturated conditions. A combined system has also been developed that includes the ANN delay estimating model and a user-equilibrium (UE) traffic assignment model. The combined system employs the Frank-Wolfe method to achieve a convergent solution. Because the ANN delay model provides no derivatives of the delay function, a Mesh Adaptive Direct Search (MADS) method is applied to assist in and expedite the iterative process of the Frank-Wolfe method. The performance of the combined system confirms that convergence of the solution is achieved, although the global optimum may not be guaranteed.
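For illustration, a minimal Frank-Wolfe iteration for user equilibrium on a toy two-route network is sketched below, with the standard BPR function standing in for the ANN intersection-delay model; demand, capacities and free-flow times are assumed values:

    import numpy as np

    # Frank-Wolfe user-equilibrium assignment on a two-route network.
    # The BPR delay function replaces the dissertation's ANN delay
    # model; all numbers below are illustrative assumptions.
    def bpr(t0, cap, v):
        return t0 * (1 + 0.15 * (v / cap) ** 4)   # BPR link travel time

    t0 = np.array([10.0, 12.0])       # free-flow times of the two routes
    cap = np.array([400.0, 500.0])    # route capacities
    demand = 600.0

    v = np.array([demand, 0.0])       # all-or-nothing starting flows
    for k in range(1, 200):
        t = bpr(t0, cap, v)
        aon = np.zeros(2); aon[np.argmin(t)] = demand  # auxiliary AON flows
        v += (aon - v) / (k + 1)      # predetermined step size 1/(k+1)

    print("flows:", v, "times:", bpr(t0, cap, v))  # times near-equal at UE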

Relevance:

30.00%

Publisher:

Abstract:

Traffic incidents are a major source of traffic congestion on freeways. Freeway traffic diversion using pre-planned alternate routes has been used as a strategy to reduce traffic delays due to major traffic incidents. However, it is not always beneficial to divert traffic when an incident occurs: route diversion may adversely impact traffic on the alternate routes and may not result in an overall benefit. This dissertation research applies Artificial Neural Network (ANN) and Support Vector Regression (SVR) techniques to predict the percent of delay reduction from route diversion, to help determine whether traffic should be diverted under given conditions. The DYNASMART-P mesoscopic traffic simulation model was applied to generate the simulated data used to develop the ANN and SVR models. A sample network that comes with the DYNASMART-P package was used as the base simulation network. Combinations of different levels of incident duration, capacity lost, percent of drivers diverted, VMS (variable message sign) messaging duration, and network congestion were simulated to represent different incident scenarios. The resulting percent of delay reduction, average speed, and queue length from each scenario were extracted from the simulation output. The ANN and SVR models were then calibrated for percent of delay reduction as a function of all of the simulated input and output variables. The results show that both the calibrated ANN and SVR models, when applied to the same location used to generate the calibration data, were able to predict delay reduction with relatively high accuracy in terms of mean square error (MSE) and regression correlation. It was also found that the performance of the ANN model was superior to that of the SVR model. Likewise, when the models were applied to a new location, only the ANN model could produce comparatively good delay reduction predictions under high network congestion levels.
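A minimal sketch of the SVR calibration step, using scikit-learn's SVR on synthetic scenario features; the data and the assumed relationship between features and delay reduction are illustrative, not DYNASMART-P output:

    import numpy as np
    from sklearn.svm import SVR

    # Fit an SVR model mapping incident-scenario features to percent
    # delay reduction. The synthetic features (duration, capacity lost,
    # percent diverted, congestion) and ground truth are assumptions.
    rng = np.random.default_rng(1)
    X = rng.uniform(0, 1, size=(300, 4))     # scenario features, scaled 0-1
    # assumed truth: diversion helps more for long, severe incidents
    y = (40 * X[:, 0] * X[:, 1] * X[:, 2] * (1 - 0.5 * X[:, 3])
         + rng.normal(0, 1.0, 300))

    model = SVR(kernel="rbf", C=10.0, epsilon=0.5).fit(X[:250], y[:250])
    pred = model.predict(X[250:])
    print("test MSE:", np.mean((pred - y[250:]) ** 2))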

Relevance:

30.00%

Publisher:

Abstract:

This dissertation explores the role of artillery forward observation teams during the battle of Okinawa (April–June 1945). It addresses a variety of questions associated with this front-line artillery support. First, it examines the role of artillery itself in the American victory over the Japanese on Okinawa. Second, it traces the history of the forward observer in the three decades before the end of World War II. Third, it defines the specific role of the forward observation teams during the battle: what they did and how they did it during this three-month duel. Fourth, it deals with the particular problems of the forward observer. These included coordination with the local infantry commander, adjusting to the periodic rotation between the front lines and the artillery battery behind the line of battle, responding to occasional problems with "friendly fire" (American artillery falling on American ground forces), dealing with personnel turnover in the teams (due to death, wounds, and illness), and finally, developing a more informal relationship between officers and enlisted men to accommodate the reality of this recently created combat assignment. Fifth, it explores the experiences of a select group of men who served on (or in proximity to) forward observation teams on Okinawa. Previous scholars and popular historians of the battle have emphasized the role of Marines, infantrymen, and flame-throwing armor. This work offers a different perspective on the battle, and it uses new sources as well. A pre-existing archive of interviews with forward observer team members from the Okinawa campaign, conducted in the 1990s, forms the core of the oral history component of this research project. The verbal accounts were checked against and supplemented by a review of unit reports obtained from the U.S. National Archives and various secondary sources. The dissertation concludes that an understanding of American artillery observation is critical to a more complete comprehension of the battle of Okinawa. These mid-ranking (and largely middle-class) soldiers proved capable of adjusting to the demands of combat conditions. They provide a unique and understudied perspective on the entire battle.

Relevance:

30.00%

Publisher:

Abstract:

This research is motivated by a practical application observed at a printed circuit board (PCB) manufacturing facility. After assembly, the PCBs (or jobs) are tested in environmental stress screening (ESS) chambers (or batch processing machines) to detect early failures. Several PCBs can be tested simultaneously as long as the total size of all the PCBs in the batch does not violate the chamber capacity. PCBs from different production lines arrive dynamically at a queue in front of a set of identical ESS chambers, where they are grouped into batches for testing. Each line delivers PCBs that vary in size and require different testing (or processing) times. Once a batch is formed, its processing time is the longest processing time among the PCBs in the batch, and its ready time is given by the PCB arriving last to the batch. ESS chambers are expensive and a bottleneck; consequently, the makespan has to be minimized. A mixed-integer formulation is proposed for the problem under study and compared to a recently published formulation. The proposed formulation is better in terms of the number of decision variables, linear constraints and run time. A procedure to compute a lower bound is proposed. For sparse problems (i.e. when job ready times are widely dispersed), the lower bounds are close to the optimum. The problem under study is NP-hard. Consequently, five heuristics, two metaheuristics (simulated annealing (SA) and a greedy randomized adaptive search procedure (GRASP)) and a decomposition approach (column generation) are proposed, especially to solve problem instances which require prohibitively long run times when a commercial solver is used. An extensive experimental study was conducted to evaluate the different solution approaches on solution quality and run time. The decomposition approach improved the lower bounds (or linear relaxation solution) of the mixed-integer formulation. At least one of the proposed heuristics outperforms the Modified Delay heuristic from the literature. For sparse problems, almost all the heuristics report a solution close to the optimum. GRASP outperforms SA at a higher computational cost. The proposed approaches are viable to implement, as the run time is very short.
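To make the batching mechanics concrete, the sketch below shows a simple greedy heuristic for one chamber: jobs are taken in ready-time order and packed until the capacity would be exceeded, with each batch's processing time set by its longest job and its start bounded by its latest arrival; this is an illustrative heuristic, not one of the five proposed in the work:

    # Greedy batching sketch for the ESS-chamber problem. Job data,
    # capacity and the ready-time ordering rule are illustrative
    # assumptions, not the dissertation's proposed heuristics.
    def greedy_batches(jobs, capacity):
        """jobs: list of (ready, size, proc). Returns one-chamber makespan."""
        jobs = sorted(jobs)                       # by ready time
        clock, batch_size, batch = 0.0, 0.0, []
        def run(batch, clock):
            start = max(clock, max(r for r, _, _ in batch))
            return start + max(p for _, _, p in batch)
        for job in jobs:
            if batch and batch_size + job[1] > capacity:
                clock = run(batch, clock)         # close the full batch
                batch, batch_size = [], 0.0
            batch.append(job); batch_size += job[1]
        return run(batch, clock)

    pcbs = [(0, 4, 10), (1, 3, 6), (2, 5, 8), (5, 6, 12), (6, 2, 4)]
    print("makespan:", greedy_batches(pcbs, capacity=10))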

Relevance:

30.00%

Publisher:

Abstract:

This dissertation introduces a new system for handwritten text recognition based on an improved neural network design. Most existing neural networks treat the mean square error function as the standard error function. The system proposed in this dissertation utilizes the mean quartic error function, whose third and fourth derivatives are non-zero. Consequently, many improvements to the training methods were achieved. The training results are carefully assessed before and after each update. To evaluate the performance of a training system, three essential factors are considered, in decreasing order of importance: (1) the error rate on the testing set, (2) the processing time needed to recognize a segmented character and (3) the total training time and, subsequently, the total testing time. It is observed that bounded training methods accelerate the training process, while semi-third-order training methods, next-minimal training methods and preprocessing operations reduce the error rate on the testing set. Empirical observations suggest that two different combinations of training methods are needed for lower- and upper-case character recognition. Since character segmentation is required for word and sentence recognition, this dissertation also provides an effective rule-based segmentation method, which differs from conventional adaptive segmentation methods. Dictionary-based correction is utilized to correct mistakes resulting from the recognition and segmentation phases. The integration of the segmentation methods with the handwritten character recognition algorithm yielded an accuracy of 92% for lower-case characters and 97% for upper-case characters. The testing database consists of 20,000 handwritten characters, 10,000 for each case; recognizing the 10,000 handwritten characters in the testing phase required 8.5 seconds of processing time.
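A minimal sketch contrasting the mean square and mean quartic error functions and their gradients with respect to the prediction; the example values are illustrative:

    import numpy as np

    # Mean square vs. mean quartic error. The quartic loss has non-zero
    # third and fourth derivatives, so large residuals are penalised
    # much more sharply during training. Values below are illustrative.
    def mse(y, t):
        return np.mean((y - t) ** 2)

    def mse_grad(y, t):
        return 2 * (y - t) / y.size

    def mqe(y, t):                        # mean quartic error
        return np.mean((y - t) ** 4)

    def mqe_grad(y, t):
        return 4 * (y - t) ** 3 / y.size

    y = np.array([0.1, 0.9, 0.4]); t = np.array([0.0, 1.0, 1.0])
    print("MSE:", mse(y, t), "grad:", mse_grad(y, t))
    print("MQE:", mqe(y, t), "grad:", mqe_grad(y, t))  # big residual dominates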

Relevance:

30.00%

Publisher:

Abstract:

Just about everyone who ranks cruise lines puts Seabourn first on the list. The readers of Conde Nast Traveler ranked it the world's top cruise line for three consecutive years and fifth in their survey of the top 100 overall travel experiences. Of special interest to hospitality professionals is Seabourn's 98.5 percent score for service, higher than any other vacation experience in the world.