889 results for "level of information"
Abstract:
Maize is considered a source of carotenoids; however, these compounds are highly unstable and are degraded by high temperatures, exposure to light and the presence of oxygen. The objective of this work was to evaluate the influence of grain moisture and the type of drying applied to the grains on the carotenoid levels of yellow maize. The experiment was conducted in a completely randomized design (2 × 4 factorial) with two levels of initial moisture at harvest (22 and 19%) and four drying treatments (in the sun, in the shade, in a dryer, and an undried control). After drying to 12% final moisture, the grain samples were analyzed for concentrations of total carotenoids, carotenes (α-carotene + β-carotene), monohydroxylated carotenoids (β-cryptoxanthin), and xanthophylls (lutein + zeaxanthin). Initial moisture, type of drying and the moisture × drying interaction influenced (p ≤ 0.05) the carotenoid levels in the grains. This is the first report on drying conditions and initial moisture at harvest as influences on the profile and content of carotenoids in maize grains. Based on the results, this work suggests that harvest be carried out preferably when the grains present 22% moisture, with drying in a dryer or in the shade before further use or storage.
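A balanced 2 × 4 factorial like the one described above (two harvest moistures × four drying treatments) is conventionally tested with a two-way ANOVA, where the moisture × drying interaction is the effect of interest. A minimal sketch in Python follows; all carotenoid values here are randomly generated placeholders, not the study's data:

```python
import numpy as np

# Hypothetical total-carotenoid readings (µg/g) for a balanced 2 x 4 design:
# axis 0 = initial moisture (22%, 19%), axis 1 = drying (sun, shade, dryer,
# control), axis 2 = n = 3 replicates per cell. Illustrative values only.
rng = np.random.default_rng(0)
data = rng.normal(loc=20, scale=2, size=(2, 4, 3))

a, b, n = data.shape
grand = data.mean()
cell = data.mean(axis=2)           # per-cell means
mA = data.mean(axis=(1, 2))        # moisture (factor A) means
mB = data.mean(axis=(0, 2))        # drying (factor B) means

# Sums of squares for a balanced two-way layout.
ss_a = b * n * ((mA - grand) ** 2).sum()
ss_b = a * n * ((mB - grand) ** 2).sum()
ss_ab = n * ((cell - mA[:, None] - mB[None, :] + grand) ** 2).sum()
ss_err = ((data - cell[:, :, None]) ** 2).sum()

# F statistic for the moisture x drying interaction,
# compared against F((a-1)(b-1), ab(n-1)).
ms_ab = ss_ab / ((a - 1) * (b - 1))
ms_err = ss_err / (a * b * (n - 1))
f_interaction = ms_ab / ms_err
print(f"F(moisture x drying) = {f_interaction:.2f}")
```

The decomposition SS_total = SS_A + SS_B + SS_AB + SS_error holds exactly because the design is balanced; with unequal cell sizes a regression-based ANOVA would be needed instead.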
Abstract:
In the new age of information technology, big data has grown to be a prominent phenomenon. As information technology evolves, organizations have begun to adopt big data and apply it as a tool throughout their decision-making processes. Research on big data has grown in recent years, however mainly from a technical stance, and there is a void in business-related cases. This thesis fills the gap in the research by addressing big data challenges and failure cases. The Technology-Organization-Environment (TOE) framework was applied to carry out a literature review on trends in Business Intelligence and Knowledge Management information system failures. A review of the extant literature was carried out using a collection of leading information systems journals. Academic papers and articles on big data, Business Intelligence, Decision Support Systems, and Knowledge Management systems were studied from both failure and success aspects in order to build a model for big data failure. I then delineate the contribution of the information system failure literature, as it provides the principal dynamics behind the technology-organization-environment framework. The gathered literature was then categorised, and a failure model was developed from the identified critical failure points. The failure constructs were further categorized, defined, and tabulated into a contextual diagram. The developed model and table were designed to act as a comprehensive starting point and as general guidance for academics, CIOs and other system stakeholders, to facilitate decision-making in the big data adoption process by measuring the effect of technological, organizational, and environmental variables on perceived benefits, dissatisfaction and discontinued use.
Abstract:
Global digitalization has also affected the industrial sector. A trend called the Industrial Internet has been present for some years and has established a relatively steady position in business. The Industrial Internet is also referred to as Industry 4.0 and, in consumer businesses, as the IoT (Internet of Things). The trend ultimately consists of many traditionally proven technologies and concepts, such as condition monitoring, remote services, predictive maintenance and Internet customer portals. All these technologies, and the information related to them, are expected to change the rules of business in the industrial sector. This may even result in a new industrial revolution. This research focuses on Industrial Internet products, services and applications. The study analyses four case companies and their digital service offerings. Based on this analysis, the services are compared to find out whether there is still space for companies to gain competitive advantage through differentiation with these state-of-the-art solutions. One of the case companies, Case Company Ltd., serves as the primary case company and the subscriber of this research. The research and results are analyzed primarily from this company's perspective and needs. In the empirical part, the research clarifies how Case Company Ltd. has allocated its development resources over the last five years. These allocations in certain categories are then compared with the other case companies' current customer offerings, and conclusions are drawn on how the approaches of the different companies differ from each other. Existing theoretical knowledge of the Industrial Internet is still finding its shape. In this research we examine how the case company analysis and findings correlate with the existing knowledge and literature on the topic.
Abstract:
Since the early 1970s, Canadians have expressed many concerns about the growth of government and its impact on their daily lives. The public has requested increased access to government documents and improved protection of the personal information which is held in government files and data banks. At the same time, both academics and practitioners in the field of public administration have become more interested in the values that public servants bring to their decisions and recommendations. Certain administrative values, such as accountability and integrity, have taken on greater relative importance. The purpose of this thesis is to examine the implementation of Ontario's access and privacy law. It centres on the question of whether or not the Freedom of Information and Protection of Privacy Act, 1987 (FIPPA) has answered the demand for open access to government while at the same time protecting the personal privacy of individual citizens. It also assesses the extent to which this relatively new piece of legislation has made a difference to the people of Ontario. The thesis presents an overview of the issues of freedom of information and protection of privacy in Ontario. It begins with the evolution of the legislation and a description of the law itself. It focuses on the structures and processes which have been established to meet the procedural and administrative demands of the Act. These structures and processes are evaluated in two ways. First, the thesis evaluates how open the Ontario government has become and, second, it determines how carefully the privacy rights of individuals are safeguarded. An analytical framework of administrative values is used to evaluate the overall performance of the government in these two areas. The conclusion is drawn that, overall, the Ontario government has effectively implemented the Freedom of Information and Protection of Privacy Act, particularly by providing access to most government-held documents.
The protection of individual privacy has proved to be not only more difficult to achieve, but more difficult to evaluate. However, the administrative culture of the Ontario bureaucracy is shown to be committed to ensuring that the access and privacy rights of citizens are respected.
Abstract:
Some Ecological Factors Affecting the Input and Population Levels of Total and Faecal Coliforms and Salmonella in Twelve Mile Creek, Lake Ontario and Sewage Waters Near St. Catharines, Ontario. Supervisor: Dr. M. Helder. The present study was undertaken to investigate the role of some ecological factors on sewage-borne bacteria in waters near St. Catharines, Ontario. Total and faecal coliform levels and the presence of Salmonella were monitored for a period of a year, along with determination of temperature, pH, dissolved oxygen, total dissolved solids, nitrate N, total phosphate P and ammonium N. Bacteriological tests for coliform analysis were done according to APHA Standard Methods by the membrane filtration technique. The grab sampling technique was employed for all sampling. Four sample sites were chosen in the Port Dalhousie beach area to determine what bacteriological or physical relationship the sites had to each other. The sample sites chosen were the sewage inflow to and the effluent from the St. Catharines (Port Dalhousie) Pollution Control Plant, Twelve Mile Creek below the sewage outfall, and Lake Ontario at the Lakeside Park beach. The sewage outfall was located in Twelve Mile Creek, approximately 80 meters from the creek's junction with the beach and piers on Lake Ontario. Twelve Mile Creek normally carried a large volume of water from the Welland Canal which was diverted through the DeCew Generating Station located on the Niagara Escarpment. An additional sample site, which was thought to be free of industrial wastes, was chosen at Twenty Mile Creek, also in the Niagara Region of Ontario. There were marked variations in bacterial numbers at each site and between sites, but trends towards lower numbers were noted from the sewage inflow to Lake Ontario. Better correlations were noted between total and faecal coliform population levels and total phosphate P and ammonium N in Twenty Mile Creek.
Other correlations were observed at other sample stations; however, these results also appeared to be random in nature. Salmonella isolations occurred more frequently during the winter and spring months, when water temperatures were minimal, at all sample stations except the sewage inflow. The frequency of Salmonella isolations appeared to be related to increased levels of total and faecal coliforms in the sewage effluent. However, no clear relationships were established at the other sample stations. Due to the presence of Salmonella and high levels of total and faecal coliform indicator organisms, the sanitary quality of Lake Ontario and Twelve Mile Creek at the sample sites appeared to be impaired over the major portion of the study period.
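Correlations of the kind reported above, between coliform counts and nutrient levels, are typically screened with a Pearson coefficient on log-transformed bacterial counts. A minimal sketch follows; the paired readings are invented for illustration and are not the study's data:

```python
import numpy as np

# Hypothetical paired readings from a single station (not the study's data):
# faecal coliforms (CFU/100 mL) and total phosphate P (mg/L).
coliforms = np.array([120, 340, 90, 560, 410, 230, 700, 150])
phosphate = np.array([0.20, 0.55, 0.15, 0.90, 0.70, 0.40, 1.10, 0.25])

# Bacterial counts span orders of magnitude and are commonly
# log10-transformed before computing the correlation.
r = np.corrcoef(np.log10(coliforms), phosphate)[0, 1]
print(f"Pearson r = {r:.2f}")
```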
Abstract:
UANL
Abstract:
The blood-brain barrier (BBB) protects the central nervous system (CNS) by controlling the passage of blood-borne substances and immune cells. The BBB is formed of endothelial cells linked together by tight junctions, and its functions are maintained by astrocytes, which secrete a number of essential factors. A proteomic analysis of lipid rafts from human BBB endothelial cells identified the presence of the Hedgehog (Hh) signalling pathway, a pathway often associated with embryonic developmental processes as well as with adult tissues. Through our experiments, I determined that astrocytes produce and secrete the Sonic Hh (Shh) ligand and that primary cultures of human endothelial cells express the receptor Patched (Ptch)-1, the co-receptor Smoothened (Smo) and the transcription factor Gli-1. Moreover, activation of the Hh pathway increases the tightness of BBB endothelial cells in vitro. Blocking activation of the Hh pathway, using the antagonist cyclopamine as well as Shh-deficient (-/-) mice, decreases the expression of the tight-junction proteins claudin-5, occludin and ZO-1. The signalling pathway also proved to be immunomodulatory, since its activation in BBB endothelial cells decreases the surface expression of the adhesion molecules ICAM-1 and VCAM-1, as well as the secretion of the pro-inflammatory chemokines IL-8/CXCL8 and MCP-1/CCL2, producing a decrease in the migration of CD4+ lymphocytes across a monolayer of BBB endothelial cells. In vitro treatment with the pro-inflammatory cytokines TNF-α and IFN-γ increases the production of Shh by astrocytes, as well as the surface expression of Ptch-1 and Smo. In active multiple sclerosis (MS) lesions, where the BBB is more permeable, hypertrophic astrocytes increase their expression of Shh.
However, BBB endothelial cells do not increase their expression of Ptch-1 or Smo, suggesting a dysfunction in the Hh signalling pathway. These results show that the Hh signalling pathway promotes the properties of the BBB, and that an inflammatory environment could potentially disrupt the BBB by affecting Hh signalling in endothelial cells.
Abstract:
This thesis is a collection of three articles in the economics of information. The first chapter serves as an introduction, and Chapters 2 to 4 constitute the core of the work. Chapter 2 deals with the acquisition of information on the Internet through consumer reviews. In particular, I determine whether reviews left by buyers can still transmit information to other consumers when it is known that sellers can post fake reviews about their products. To understand whether this manipulation of reviews is problematic, I show that the platform on which the reviews are posted (e.g. TripAdvisor, Yelp) is an important third party to consider, as much as the sellers attempting to falsify the reviews. Indeed, the design adopted by the platform has an indirect effect on the sellers' level of manipulation. In particular, I show that the platform, by hiding part of the review content it holds, can sometimes improve the quality of the information obtained by consumers. Finally, the design chosen by the platform may be linked to the way it generates its revenues. I show that a platform generating revenues through sales commissions may be more tolerant of manipulation than a platform that generates revenues through advertising. Chapter 3 is written in collaboration with Marc Santugini. In this chapter, we study the effects of third-degree price discrimination in the presence of uninformed consumers who learn about the quality of a product through its price. In a stochastic environment with two market segments, we show that price discrimination can hurt the firm and benefit consumers.
On the one hand, price discrimination reduces the uncertainty faced by consumers; that is, the variance of posterior beliefs is lower with discrimination than with a uniform price. Indeed, observing two prices (under discrimination) provides more information to consumers, even if each of these prices is individually less informative than the uniform price. On the other hand, it is not always optimal for the firm to price-discriminate, since the presence of uninformed consumers gives it an incentive to engage in signaling. If the advantage provided by the flexibility of setting two different prices is outweighed by the cost of signaling with two different prices, then it is optimal for the firm to set a uniform price in the market. Finally, Chapter 4 is written in collaboration with Sidartha Gordon. In this chapter, we study a class of games in which players are constrained in the number of information sources they can choose to learn about a parameter of the game, but have some freedom regarding the degree of dependence of their signals, before taking an action. Introducing a new dependence order between signals, we show that a player prefers information that is as dependent as possible on the information obtained by the players whose actions are either strategic complements and isotonic, or strategic substitutes and antitonic, with his own. Likewise, a player prefers information that is as independent as possible of the information obtained by the players whose actions are either strategic substitutes and isotonic, or strategic complements and antitonic, with his own. We also establish sufficient conditions for a given information structure, such as public or private information, to be possible in equilibrium.
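The posterior-variance claim in Chapter 3 can be illustrated with a standard normal-learning sketch; the notation and distributional assumptions here are mine, not necessarily the thesis's. Suppose consumers hold a prior $q \sim N(\mu_0, \sigma_0^2)$ over quality, and each observed price is a conditionally independent signal $p_i = q + \varepsilon_i$ with $\varepsilon_i \sim N(0, \sigma_i^2)$. Posterior precisions then add across signals:

```latex
% Uniform pricing: one signal with noise variance \sigma_u^2.
\operatorname{Var}(q \mid p_u) = \left( \frac{1}{\sigma_0^2} + \frac{1}{\sigma_u^2} \right)^{-1}
% Discrimination: two signals, each with noise variance \sigma_d^2.
\operatorname{Var}(q \mid p_1, p_2) = \left( \frac{1}{\sigma_0^2} + \frac{2}{\sigma_d^2} \right)^{-1}
% Two prices yield the lower posterior variance whenever
% 2/\sigma_d^2 > 1/\sigma_u^2, i.e. \sigma_d^2 < 2\sigma_u^2:
% each price may individually be less informative (\sigma_d^2 > \sigma_u^2)
% while the pair is jointly more informative, as the chapter describes.
```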
Abstract:
Sharing information with those in need of it has always been an idealistic goal of networked environments. With the proliferation of computer networks, information is so widely distributed among systems that it is imperative to have well-organized schemes for retrieval and also for discovery. This thesis attempts to investigate the problems associated with such schemes and suggests a software architecture aimed towards achieving meaningful discovery. The use of information elements as a modelling base for efficient information discovery in distributed systems is demonstrated with the aid of a novel conceptual entity called the infotron. The investigations are focused on distributed systems and their associated problems. The study was directed towards identifying a suitable software architecture and incorporating it in an environment where information growth is phenomenal and a proper mechanism for carrying out information discovery becomes feasible. An empirical study, undertaken with the aid of an election database of geographically distributed constituencies, provided the insights required. This is manifested in the Election Counting and Reporting Software (ECRS) System. The ECRS system is a software system, essentially distributed in nature, designed to prepare reports for district administrators about the election counting process and to generate other miscellaneous statutory reports. Most distributed systems of the nature of ECRS normally possess a "fragile architecture" which makes them amenable to collapse with the occurrence of minor faults. This is resolved with the help of the proposed penta-tier architecture, which contains five different technologies at the different tiers of the architecture. The results of the experiment conducted, and their analysis, show that such an architecture helps to keep the different components of the software intact and impermeable to any internal or external faults.
The architecture thus evolved needed a mechanism to support information processing and discovery. This necessitated the introduction of the novel concept of infotrons. Further, when a computing machine has to perform any meaningful extraction of information, it is guided by what is termed an infotron dictionary. The other empirical study was to find out which of the two prominent markup languages, namely HTML and XML, is best suited for the incorporation of infotrons. A comparative study of 200 documents in HTML and XML was undertaken. The result was in favor of XML. The concepts of the infotron and of the infotron dictionary, which were developed, were applied to implement an Information Discovery System (IDS). IDS is essentially a system that starts with the infotron(s) supplied as clue(s) and results in brewing the information required to satisfy the need of the information discoverer, by utilizing the documents available at its disposal (as its information space). The various components of the system and their interaction follow the penta-tier architectural model and can therefore be considered fault-tolerant. IDS is generic in nature, and its characteristics and specifications were drawn up accordingly. Many subsystems interacted with multiple infotron dictionaries maintained in the system. In order to demonstrate the working of the IDS, and to discover information without modification of a typical Library Information System (LIS), an Information Discovery in Library Information System (IDLIS) application was developed. IDLIS is essentially a wrapper for the LIS, which maintains all the databases of the library. The purpose was to demonstrate that the functionality of a legacy system could be enhanced by the augmentation of IDS, leading to an information discovery service. IDLIS demonstrates IDS in action.
IDLIS proves that any legacy system could be augmented with IDS effectively to provide the additional functionality of an information discovery service. Possible applications of IDS and the scope for further research in the field are also covered.
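The abstract does not specify how an infotron dictionary is implemented. As a purely hypothetical illustration of the role it describes, mapping clue terms (infotrons) to the documents in the information space that satisfy them, a minimal sketch as an inverted index could look like this; all class names, document identifiers and text here are my own, not the thesis's:

```python
from collections import defaultdict

class InfotronDictionary:
    """Hypothetical sketch: an inverted index from clue terms to documents."""

    def __init__(self):
        self.index = defaultdict(set)

    def register(self, doc_id, text):
        # Index every word of the document as a candidate infotron.
        for term in text.lower().split():
            self.index[term].add(doc_id)

    def discover(self, *infotrons):
        # Return the documents matching ALL supplied infotrons (clues).
        sets = [self.index.get(t.lower(), set()) for t in infotrons]
        return set.intersection(*sets) if sets else set()

idx = InfotronDictionary()
idx.register("doc1", "election counting report for district")
idx.register("doc2", "statutory report for district administrators")
print(idx.discover("report", "district"))  # both documents match
```

A production discovery system would add ranking, stemming and distribution across nodes; the sketch only shows the clue-to-document mapping that the infotron dictionary concept names.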
Abstract:
The shift from print to digital information has a high impact on all components of the academic library system in India, especially the users, services and staff. Though information is considered an important resource, the use of ICT tools to collect and disseminate information has proceeded at a slow pace in the majority of university libraries. This may be due to various factors such as insufficient funds, inadequate staff trained in handling computers and software packages, and administrative concerns. In Kerala, automation has been initiated in almost all university libraries using library automation software and is at different stages of completion. There have not been many studies about the effects of information communication technologies on the professional activities of library professionals in the universities in Kerala. It is important to evaluate whether progress in ICT has had any impact on the library profession in these highest educational institutions. The aim of the study is to assess whether developments in information communication technologies have any influence on library professionals' professional development and the need for further education and training in the profession, and to evaluate their skills in handling developments in ICT. The total population of the study is 252, comprising the permanently employed professional library staff in the central libraries and departmental libraries on the main campuses of the universities under study. This is almost a census study of the defined population. The questionnaire method was adopted for the collection of data for this study, supplemented by interviews with librarians to gather additional information. Library professionals have a positive approach towards ICT applications and services in libraries, but the majority do not have opportunities to develop their skills and competencies in their work environment.
To develop competitive personnel in a technologically advanced world, high priority must be given by university administrators and library associations to developing competence in ICT applications, library management and soft skills among library professionals. Library science schools and teaching departments across the country have to take significant steps to revise the library science curriculum and incorporate significant changes to meet the demands and challenges of the library science profession.
Abstract:
This thesis is entitled "The Right to Freedom of Information in India". In a democracy, the citizens being the persons who choose their own governors, the right to know from the Government is a pre-condition for a properly evaluated election. Freedom of speech and expression, one of the repositories of self-government, forms the basis for the right to know on a wider scale. The functions which free speech rights serve in a society also emphasize the need for more openness in the functioning of a democracy. The maintenance of law and order and the investigation of crimes are highly important in a country like India, where no risk may be taken on account of the public's right to know. The Indian situations relating to terrorist activities and to riots based on language, region, religion and caste are important in this respect. The right to know of the citizens may be regulated in the interests of the secrecy required in these areas. On the basis of the conclusions reached in this study, a draft Bill has been proposed for the passing of an Access to Public Documents Act. This Bill is appended to this thesis.