INFORMATICS: The branch of engineering that studies information processing using automatic machines. It is a broad field that includes the theoretical foundations, design, programming, and use of computers as tools for problem solving.
FEATURES:
1 The traditional figures of law cannot be applied to information, given its originality, so new legal figures must be created.
2 There is a risk to the exercise of citizens' freedoms.
3 There is a continuous emergence of computer goods with economic content that need legal protection.
4 Multinational empires impose a series of clauses that conflict with Spanish law.
5 Common law influences its language and its doctrine.
6 The term is formed from two French words: information and automatique.
7 It studies the automatic processing of information using electronic devices and computer systems.
8 It is the discipline in charge of studying methods, processes, and techniques of development and utilization.
9 Its end is to store, process, and transmit information and data in digital format.
Also
called the decimal system, the base-ten numbering system is a positional numeral system in which
quantities are represented using arithmetic based on powers of the number ten. The
set of symbols used (the Arabic numeral system) consists of ten digits: zero
(0), one (1), two (2), three (3), four (4), five (5), six (6), seven (7),
eight (8), and nine (9).
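The positional rule above can be sketched in a few lines of Python; the function below is an illustrative helper (not part of any library) that accumulates digits weighted by powers of ten:

```python
# A minimal sketch of positional base-ten notation: each digit is
# weighted by a power of ten according to its position. The loop uses
# Horner's rule, which is equivalent to summing d * 10**position.
def decimal_value(digits):
    """Value of a digit sequence, e.g. [4, 0, 7] -> 407."""
    value = 0
    for d in digits:
        value = value * 10 + d  # shift left one position, then add the digit
    return value

# 407 = 4*10**2 + 0*10**1 + 7*10**0
print(decimal_value([4, 0, 7]))  # 407
```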
Sebastián Ortiz
DEVELOPER
A software developer
is a person whose work is to research, analyze, design, implement, and test
software required by a specific company or client. A software
developer can also create applications for anyone to use, which can be
distributed in many ways across the internet or on platforms dedicated to
distributing them. Software developers can be analysts or programmers, and they
normally work in groups to create software faster.
Santiago Valencia Buitrago
INTERNET
The
Internet is a global system of interconnected computer networks that use the
standard Internet protocol suite (TCP/IP) to link several billion devices
worldwide. It is a network of networks that consists of millions of private,
public, academic, business, and government networks of local to global scope,
linked by a broad array of electronic, wireless, and optical networking
technologies. The Internet carries an extensive range of information resources
and services, such as the inter-linked hypertext documents and applications of
the World Wide Web (WWW), the infrastructure to support email, and peer-to-peer
networks for file sharing and telephony.
MySQL is the most popular open-source relational SQL database management system.
MySQL is one of the best RDBMSs for developing web-based software applications.
This tutorial will give you a quick start with MySQL and make you comfortable with MySQL programming.
Audience
This reference has been prepared for beginners, to help them understand the basic to advanced concepts of MySQL.
Prerequisites
Before you start practicing with the various examples given in this reference, I am assuming that you are already aware of what a database is, especially an RDBMS, and what a computer programming language is.
Compile/Execute MySQL Programs
If you are willing to compile and execute SQL programs with an SQLite DBMS but do not have a setup for it, do not worry. compileonline.com is available on a high-end dedicated server, giving you real programming experience with the comfort of single-click execution. Yes, it is absolutely free and online.
CASE tools are a set of software applications used to automate SDLC activities. They are used by software project managers, analysts, and engineers to develop software systems.
A number of CASE tools are available to simplify various stages of the Software Development Life Cycle: analysis tools, design tools, project management tools, database management tools, and documentation tools, to name a few.
The use of CASE tools accelerates the development of a project toward the desired result and helps uncover flaws before moving on to the next stage of software development.
An algorithm is a set
of instructions designed to perform a specific task. This can be a simple
process, such as multiplying two numbers, or a complex operation, such as
playing a compressed video file.
A computer program
can be viewed as an elaborate algorithm. In mathematics and computer science,
an algorithm usually means a small procedure that solves a recurrent problem.
In computer
programming, algorithms are often created as functions. These functions serve
as small programs that can be referenced by a larger program. For example, an
image viewing application may include a library of functions that each use a
custom algorithm to render different image file formats.
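As a minimal illustration of an algorithm written as a function, here is the multiplication example from above, done by repeated addition. This is a sketch for illustration only, not an efficient method:

```python
# A simple algorithm packaged as a function: multiply two
# non-negative integers by adding 'a' to an accumulator 'b' times.
def multiply(a, b):
    result = 0
    for _ in range(b):  # repeat the addition b times
        result += a
    return result

print(multiply(6, 7))  # 42
```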
Norberto Cárdenas Luna
DATABASE
A database is an organized collection of
data. The data is typically organized to model aspects of reality in a way
that supports processes requiring information: for example, modelling the
availability of rooms in hotels in a way that supports finding a hotel with
vacancies.
Database management systems are computer
software applications that interact with the user, other applications, and the
database itself to capture and analyze data. A general-purpose DBMS is designed
to allow the definition, creation, querying, update, and administration of
databases. Well-known DBMSs include MySQL, PostgreSQL, Microsoft SQL Server,
Oracle, Sybase and IBM DB2. A database is not generally portable across
different DBMSs, but different DBMSs can interoperate by using standards such as
SQL and ODBC or JDBC to allow a single application to work with more than one
DBMS. Database management systems are often classified according to the
database model that they support; the most popular database systems since the
1980s have all supported the relational model as represented by the SQL
language. Sometimes a DBMS is loosely referred to as a 'database'.
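The hotel-vacancy modelling example above can be sketched with Python's built-in sqlite3 module (one of the relational engines in the SQL family named here). The table name, columns, and data are invented for illustration:

```python
import sqlite3

# Hypothetical schema for the hotel-vacancy example: a table of hotels
# and their count of free rooms, queried to find vacancies.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE hotel (name TEXT, free_rooms INTEGER)")
con.executemany("INSERT INTO hotel VALUES (?, ?)",
                [("Plaza", 0), ("Central", 3), ("Harbor", 1)])

# Query the model: which hotels currently have vacancies?
rows = con.execute(
    "SELECT name FROM hotel WHERE free_rooms > 0 ORDER BY name").fetchall()
print([name for (name,) in rows])  # ['Central', 'Harbor']
con.close()
```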
Jhon Alexander Muñoz
DATA
Data is a set of values of qualitative or quantitative variables; restated, pieces of data are individual pieces of information. Data is measured, collected, reported, and analyzed, whereupon it can be visualized using graphs or images. Data as a general concept refers to the fact that some existing information or knowledge is represented or coded in some form suitable for better usage or processing.
Raw data, i.e., unprocessed data, refers to a collection of numbers and characters, and is a relative term; data processing commonly occurs in stages, and the "processed data" from one stage may be considered the "raw data" of the next. Field data refers to raw data collected in an uncontrolled, in situ environment. Experimental data refers to data generated within the context of a scientific investigation, by observation and recording.
A firewall is a device that acts as a barrier between networks, allowing or denying transmissions from one network to another. A typical use is to place it between the local network and the Internet, as a security device to prevent intruders from accessing sensitive information. A firewall is simply a filter that controls all communications passing from one network to another and, depending on its rules, allows or denies them.
In this way a firewall can allow web, mail, and FTP services from a local network out to the Internet, but block IRC, which may be unnecessary for our work. We can also control access made from the Internet to the local network, denying everything or allowing some services such as the web (if we have a web server that we want to be accessible from the Internet).
A firewall can be a hardware or software device: a piece of equipment connected between the network and the Internet connection cable, or a program installed on the machine whose modem connects to the Internet. We can even find very powerful computers running specific software whose only job is to monitor communications between networks.
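The filtering idea can be sketched as a toy rule list: each connection is compared against ordered rules and allowed or denied. The rule format here is invented for illustration; real firewalls (iptables, pf, Windows Firewall) are far richer:

```python
# Toy firewall-style filter: match a destination port against an
# ordered rule list; anything unmatched falls through to the default.
RULES = [
    ("allow", 80),    # web
    ("allow", 25),    # mail
    ("allow", 21),    # ftp
    ("deny",  6667),  # IRC, unnecessary for our work
]

def check(port, default="deny"):
    for action, rule_port in RULES:
        if port == rule_port:
            return action
    return default

print(check(80))    # allow
print(check(6667))  # deny
print(check(9999))  # deny (default policy)
```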
A data definition language (DDL) is a simple artificial language that serves to define and describe the objects in the database: its structure, relations, and constraints. Some examples of its functions: creation, modification, and deletion of the tables that make up the database.
1. CREATE TABLE: used to create new tables in the database.
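The three DDL operations named above can be tried against Python's built-in SQLite engine; the statement syntax shown is SQLite's, which for these basic operations is close to MySQL's, and the table and column names are invented:

```python
import sqlite3

# Sketch of DDL: create a table, modify it, then delete it.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE student (id INTEGER PRIMARY KEY, name TEXT)")  # creation
con.execute("ALTER TABLE student ADD COLUMN grade REAL")                 # modification
names = [row[1] for row in con.execute("PRAGMA table_info(student)")]    # column names
print(names)  # ['id', 'name', 'grade']
con.execute("DROP TABLE student")                                        # deletion
con.close()
```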
Homeostasis is a mechanism that regulates the internal environment to maintain a stable, constant condition. Homeostasis is a characteristic of an open or closed system; it allows a system to adapt to another.
An organization can reach a steady condition only when two requirements are present: direction and progress. Direction means that despite changes in the company, the same established results or conditions are reached. Progress, with respect to the desired end, is a degree of advance that lies within the limits defined as tolerable. Progress improves when the proposed condition is reached with less effort, or with greater precision for a relatively low effort under conditions of great variability. Direction and progress can only be reached through leadership and commitment.
A proxy is a system or program that acts as a bridge through which requests sent by a client are carried to another server. Possible uses of this bridge include: reducing traffic by implementing a proxy cache, improving speed, filtering content, and hiding the client's identity from the web server, among many other things.
A more technical definition: a proxy server is a computer that acts as an intermediary between a web browser (such as Internet Explorer) and the Internet. Proxy servers help improve performance on the Internet because they store a copy of the most frequently used web pages. When a browser requests a web page stored in the proxy server's collection (its cache), the proxy server provides it, which turns out to be faster than querying the web. Proxy servers also help improve security, since they filter out some web content and malicious software. Proxy servers are often used in the networks of organizations and companies. Normally, people who connect to the Internet from home do not use a proxy server.
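The caching behaviour described above reduces to "serve a stored copy when one exists, otherwise fetch and store". A toy sketch, where fetch_from_origin is a stand-in for a real (slow) network request:

```python
# Toy proxy cache: the first request for a URL goes to the "origin";
# repeated requests are served from the local copy without re-fetching.
cache = {}

def fetch_from_origin(url):
    # Placeholder for a slow network fetch; invented for illustration.
    return f"<html>content of {url}</html>"

def proxy_get(url):
    if url not in cache:               # cache miss: ask the origin server
        cache[url] = fetch_from_origin(url)
    return cache[url]                  # cache hit: serve the stored copy

proxy_get("http://example.com/")       # fetched once
proxy_get("http://example.com/")       # served from cache
print(len(cache))  # 1
```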
Entropy, in information theory, is a magnitude that measures the information provided by a data source, that is, what it tells us about a concrete fact. For example, being told that the streets are wet when we know it has just rained contributes little information, because that is the usual thing. But if we are told that the streets are wet and we know it has not rained, it contributes a lot of information (because the streets are not watered every day). Notice that in the previous example the quantity of information is different even though it is the same message: the streets are wet. Information-compression technologies are based on this; they allow the same information to be packed into shorter messages. The measure of entropy can be applied to information sources of any nature, and it allows us to encode them adequately, indicating the code elements necessary to transmit them and eliminating any redundancy. (To communicate the result of a horse race it is enough to transmit the code associated with the winning horse; it is not necessary to describe that it is a horse race, or how it unfolded.)
Entropy can also be considered the average quantity of information that the symbols used contain. The symbols with lower probability are those that contribute more information; for example, if we take the words in a text as a system of symbols, frequent words such as "which", "the", or "a" contribute little information, whereas less frequent words such as "run", "child", or "dog" contribute more. If we erase a "that" from a given text, it will surely not affect comprehension, as it will be implied; that is not the case if we erase the word "child" from the same original text. When all the symbols are equally probable (a flat probability distribution), they all contribute the maximum relevant information and the entropy is maximal.
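The "rare symbols carry more information" idea is exactly Shannon's formula H = -Σ pᵢ·log₂(pᵢ), which can be computed directly over the observed symbol frequencies:

```python
import math
from collections import Counter

# Shannon entropy of a symbol sequence, in bits per symbol.
def entropy(symbols):
    counts = Counter(symbols)
    total = len(symbols)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

# entropy("aaaa") == 0: a certain source carries no information.
print(entropy("abab"))      # 1.0 -> one bit per symbol
print(entropy("abcdabcd"))  # 2.0 -> a flat distribution maximizes entropy
```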
Ethernet defines the cabling and signaling characteristics of the physical layer, and the frame formats of the data-link layer, of the OSI model. A local area network (LAN) is a group of computers connected in a localized area to communicate with one another and to share resources such as printers. Information is sent in the form of packets, and diverse technologies can be used for their transmission.
A patch is a
piece of software designed to update a computer program or its supporting data,
to fix or improve it. This includes fixing security vulnerabilities and other
bugs, and improving the usability or performance. Though meant to fix problems,
poorly designed patches can sometimes introduce new problems (see software
regressions). In some special cases updates may knowingly break the
functionality, for instance, by removing components for which the update
provider is no longer licensed or disabling a device.
Patch management is the process of using a strategy
and plan of what patches should be applied to which systems at a specified
time.
Patches for proprietary software are typically
distributed as executable files instead of source code. This type of patch
modifies the program executable—the program the user actually runs—either by
modifying the binary file to include the fixes or by completely replacing it.
Patches can also circulate in the form of source code
modifications. In this case, the patches usually consist of textual differences
between two source code files, called "diffs". These types of patches
commonly come out of open source projects. In these cases, developers expect
users to compile the new or changed files themselves.
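The textual "diffs" described above can be produced with Python's standard difflib module; the file name and contents below are invented for illustration:

```python
import difflib

# Sketch of a source-code patch as a textual difference ("diff")
# between two versions of the same file, in unified diff format.
old = ["def greet():\n", "    print('hello')\n"]
new = ["def greet():\n", "    print('hello, world')\n"]

text = "".join(difflib.unified_diff(old, new,
                                    fromfile="greet.py", tofile="greet.py"))
print(text)  # lines starting with '-' were removed, '+' were added
```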
Because the word "patch" carries the
connotation of a small fix, large fixes may use different nomenclature. Bulky
patches or patches that significantly change a program may circulate as
"service packs" or as "software updates". Microsoft Windows
NT and its successors (including Windows 2000, Windows XP, and later versions)
use the "service pack" terminology.
The size of patches may vary from a few kilobytes to
hundreds of megabytes; thus, more significant changes imply a larger size,
though this also depends on whether the patch includes entire files or only the
changed portion(s) of files. In particular, patches can become quite large when
the changes add or replace non-program data, such as graphics and sounds files.
Such situations commonly occur in the patching of computer games. Compared with
the initial installation of software, patches usually do not take long to
apply.
In
the case of operating systems and computer server software, patches have the
particularly important role of fixing security holes. Some critical patches involve
issues with drivers.
Patches may require prior application of other
patches, or may require prior or concurrent updates of several independent
software components. To facilitate updates, operating systems often provide
automatic or semi-automatic updating facilities. Completely automatic updates
have not succeeded in gaining widespread popularity in corporate computing
environments, partly because of the aforementioned glitches, but also because
administrators fear that software companies may gain unlimited control over
their computers. Package management systems can offer various
degrees of patch automation.
Usage of completely automatic updates has become far
more widespread in the consumer market, due largely to the
fact that Microsoft Windows added support for them, and Service Pack 2
of Windows XP (available in 2004) enabled them by default. Cautious users,
particularly system administrators, tend to put off applying patches until they
can verify the stability of the fixes. Microsoft (W)SUS supports this. In the
cases of large patches or of significant changes, distributors often limit
availability of patches to qualified developers as a beta test.
Applying patches to firmware poses special challenges,
as it often involves the provisioning of totally new firmware images, rather
than applying only the differences from the previous version. The patch usually
consists of a firmware image in form of binary data, together with a
supplier-provided special program that replaces the previous version with the
new version; a motherboard BIOS update is an example of a common firmware
patch. Any unexpected error or interruption during the update, such as a power
outage, may render the motherboard unusable. It is possible for motherboard
manufacturers to put safeguards in place to prevent serious damage; for
example, the upgrade procedure could make and keep a backup of the firmware to
use in case it determines that the primary copy is corrupt (usually through the
use of a checksum, such as a CRC).
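The checksum safeguard mentioned above can be sketched with the standard zlib CRC-32 routine: record a CRC of the firmware image, then verify any copy against it before trusting that copy. The image bytes are invented for illustration:

```python
import zlib

# Record a CRC-32 of the "good" firmware image, then use it to
# detect corruption in a copy before flashing or booting from it.
firmware = b"\x7fFIRMWARE-IMAGE-v1.2"
expected_crc = zlib.crc32(firmware)

def is_intact(image, crc):
    return zlib.crc32(image) == crc

print(is_intact(firmware, expected_crc))            # True: copy is good
print(is_intact(firmware + b"\x00", expected_crc))  # False: corruption detected
```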
Batch processing is the execution of a series of programs ("jobs") on a computer without manual intervention.
Jobs
are set up so they can be run to completion without human interaction. All
input parameters are predefined through scripts, command-line arguments, control files, or job control language. This is in contrast to "online" or interactive programs which prompt the user for such input. A program
takes a set of data files as input, processes the data, and produces a set of
output data files. This operating environment is termed "batch
processing" because the input data are collected into batches or sets of records and each batch is
processed as a unit. The output is another batch that can be reused for computation.
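The batch pattern above can be sketched in a few lines: predefined jobs run to completion with no user prompts, each batch of records is processed as a unit, and one job's output batch feeds the next. The jobs and data are invented for illustration:

```python
# Toy batch processing: apply a job's transform to every record in a
# batch; the resulting batch can be reused as input for the next job.
def run_job(batch, transform):
    return [transform(record) for record in batch]

raw = [1, 2, 3, 4]                            # input batch of records
squared = run_job(raw, lambda x: x * x)       # job 1
shifted = run_job(squared, lambda x: x + 1)   # job 2 reuses job 1's output
print(shifted)  # [2, 5, 10, 17]
```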