Otik

By Singi123, Community Contributor
1. Knowledge is formed if:

Explanation

The correct answer states that knowledge is formed when we use information for action. This means that knowledge is not simply acquired passively, but is actively applied and utilized in our actions and decision-making. By using information to guide our actions, we are able to gain experience and learn from the outcomes, which leads to the formation and development of knowledge.

2. Wisdom is formed:

Explanation

Wisdom is shaped through the use of knowledge.

3. The entropy of a binary source is maximal when the message probabilities are:

Explanation

The entropy of a binary source is maximized when the probabilities of the two messages are equal. In this case, the probabilities are P{0.5,0.5}, meaning that both messages have an equal chance of occurring. This leads to the highest level of uncertainty and therefore the maximum entropy.
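To make the claim concrete, here is a minimal Python sketch (not part of the original quiz) of the binary entropy function, showing that it peaks at P{0.5, 0.5}:

```python
from math import log2

def binary_entropy(p):
    """Entropy in bits of a binary source with message probabilities {p, 1 - p}."""
    if p in (0.0, 1.0):
        return 0.0  # a certain outcome carries no uncertainty
    return -p * log2(p) - (1 - p) * log2(1 - p)

# H(p) is maximal at p = 0.5, where both messages are equally likely.
assert binary_entropy(0.5) == 1.0
assert binary_entropy(0.9) < binary_entropy(0.5)
assert binary_entropy(0.1) < binary_entropy(0.5)
```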

4. Data is:

Explanation

The given answer suggests that the correct answer is "Najosnovniji nivo" because it is the most basic level in the DIKW (Data, Information, Knowledge, Wisdom) structure. This implies that the other options, "Napredan nivo definicije informacija" (Advanced level of defining information) and "Srednji nivo DIKW strukture" (Intermediate level of the DIKW structure), are not as fundamental as the first option. Therefore, "Najosnovniji nivo" is the correct answer.

5. The difference between data and information:

Explanation

The correct answer is "Postoji" which means "Exists" in English. This suggests that there is a difference between data and information. The explanation for this answer could be that data refers to raw facts or figures, while information is the processed and organized data that has meaning and context. Therefore, the existence of this difference implies that data and information are not the same thing.

6. A code is complete when:

Explanation

A complete code is one where there are no unused leaves in the n-ary code tree. This means that all the leaves in the tree have been utilized and assigned a code.
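One way to state this numerically is via the Kraft sum; a sketch (assuming the codeword lengths are given), where a complete n-ary code has Kraft sum exactly 1:

```python
def kraft_sum(lengths, n=2):
    """Kraft sum of an n-ary code with the given codeword lengths."""
    return sum(n ** -l for l in lengths)

# {0, 10, 11} fills the binary code tree: no unused leaves, so the sum is 1.
assert kraft_sum([1, 2, 2]) == 1.0
# {0, 10} leaves the leaf "11" unused: the sum falls short of 1 (incomplete code).
assert kraft_sum([1, 2]) < 1.0
```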

7. When we encode a message for secrecy:

Explanation

The correct answer is "Poruku stitimo od neautorizovanog pristupa." This means that when we encode a message for secrecy, we are protecting the message from unauthorized access. This suggests that the purpose of encoding a message is to ensure that only authorized individuals can understand and access its content, thereby safeguarding its confidentiality.

8. A newspaper can be considered a discrete source of information over the alphabet of the given language:

Explanation

If the alphabet is finite, then the newspapers can be considered as a discrete source of information. This means that the information provided by the newspapers can be divided into distinct and separate units, such as individual letters or words. In contrast, if the alphabet is not finite, the newspapers may contain an infinite number of possible characters or symbols, making it difficult to categorize the information as discrete.

9. A complete n-ary tree is:

Explanation

A complete n-ary tree ("potpuno n-arno stablo") is a tree in which all the leaves have the same depth.

10. The number 17 is:

Explanation

The number 17 is considered a "podatak" because it is a numerical value that can be used to provide information or represent a quantity.

11. Information is:

Explanation

The correct answer is "Primljena i shvacena poruka" (a received and understood message), because it implies that the information has been both received and comprehended. The other option, "Primljena i poslata poruka" (a received and sent message), only indicates that the message has been received and sent, not that it has been understood. The correct answer thus requires that the information is both received and processed effectively.

12. The correct hierarchy of the knowledge pyramid is:

Explanation

The correct answer is "Podaci, informacija, znanje, mudrost." This hierarchy represents the progression of data to wisdom. Data refers to raw facts and figures, which can be processed and organized to become information. Information is meaningful data that can be used to gain knowledge. Knowledge is the understanding and application of information, and wisdom is the ability to make wise decisions based on knowledge and experience. Therefore, the correct order is data, information, knowledge, and wisdom.

13. Information:

Explanation

This answer suggests that the correct answer is "Dodaje kontekst" (Adds context). This means that the given information is adding additional information or details to a situation or topic, which helps to provide a clearer understanding or perspective. It implies that the information is enhancing or enriching the existing context.

14. In a probabilistic n-ary tree:

Explanation

The given answer states that it is possible to calculate the probabilities of internal nodes in the probabilistic n-ary tree. This implies that the probabilities of nodes within the tree can be determined, suggesting that the tree has enough information or data to compute these probabilities accurately.

15. Among uniquely decodable codes, of special importance are the so-called:

Explanation

Prefix codes are uniquely decodable codes in which no codeword is a prefix of any other codeword. This makes decoding unambiguous: we can always determine where one codeword ends and the next begins. Prefix codes are widely used in data compression algorithms such as Huffman coding, where the most frequent symbols are assigned the shortest codewords for efficient encoding and decoding.
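The prefix property is easy to check mechanically; a small Python sketch (the example codes are made up for illustration):

```python
def is_prefix_code(codewords):
    """True if no codeword is a prefix of another codeword."""
    for a in codewords:
        for b in codewords:
            if a != b and b.startswith(a):
                return False
    return True

assert is_prefix_code(["0", "10", "110", "111"])
assert not is_prefix_code(["0", "01"])  # "0" is a prefix of "01"
```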

16. We have two messages and their codewords obtained by entropy coding. Message A = {01} and message B = {101}.

Explanation

Based on the given information, we can determine that the message A has a higher probability than message B. This is because message A is represented by the code word "01" which is shorter than the code word "101" used for message B. In entropy coding, shorter code words are assigned to more probable events, while longer code words are assigned to less probable events. Therefore, message A having a shorter code word implies that it has a higher probability.

17. If a meaning has been assigned to the data:

Explanation

When a meaning is assigned to data, it means that the data is interpreted and understood. This interpretation allows us to gain information from the data, as it provides us with insight and knowledge. Therefore, the correct answer is that we obtain information when a meaning is assigned to data.

18. The generally accepted view is:

Explanation

The answer reflects the view that data ("podatak") is less than information ("informacija"): information contains more detail and context than raw data alone. This matches the common understanding that data consists of raw, unprocessed facts, while information is data that has been organized, analyzed, and given meaning, so information is greater than data in both quantity and quality.

19. In a probabilistic n-ary tree, the total probability of all the leaves is:

Explanation

In a probabilistic n-ary tree, the sum of probabilities of all the leaves is equal to 1.

20. A prefix code is:

Explanation

The correct answer is "Jednoznacno dekodiv." This term refers to a prefix code that can be uniquely decoded. In other words, each encoded sequence in the code has only one possible decoding, ensuring that there is no ambiguity or confusion in the decoding process. This property is important in areas such as data compression and error correction, where the accurate retrieval of information is crucial.

21. The message "Sada je 22h." ("It is now 22:00.") is:

Explanation

The given correct answer states that the message "Sada je 22h." is an information because we add meaning to the data. In this case, the statement "Sada je 22h." provides specific information about the time, indicating that it is currently 22:00. This adds meaning to the data of the time and transforms it into information that can be understood and interpreted by the recipient.

22. H(X|y) denotes:

Explanation

The correct answer is "Uslovna entropija varijable X u odnosu na dogadjaj ili observaciju Y=y." This is because "H(X|y)" represents the conditional entropy of variable X given the event or observation Y=y.

23. Information is:

Explanation

The correct answer is "Skup poruka u nekom kontekstu" which translates to "A set of messages in some context." This answer suggests that information is a collection of messages that are meaningful and relevant within a specific context or situation. It implies that information is not just random or isolated messages, but rather has a purpose and is connected to a particular context or setting.

24. The function that satisfies all the conditions for measuring the quantity of information is:

Explanation

The correct answer is "Logaritam" (the logarithm), because the logarithm is the function used in information theory to quantify the amount of information in a message: I(p) = -log p. It decreases monotonically as the probability of the message grows, assigns zero information to a certain event (p = 1), and is additive for independent messages, since log(p1 · p2) = log p1 + log p2. These are precisely the conditions required of a measure of information.
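The additivity property can be verified numerically; a short Python sketch (illustrative only, using the self-information I(p) = -log2 p):

```python
from math import log2, isclose

def info_content(p):
    """Self-information -log2(p), in bits."""
    return -log2(p)

# The logarithm makes information additive for independent messages:
# I(p1 * p2) = I(p1) + I(p2), and a certain event carries no information.
p1, p2 = 0.5, 0.25
assert isclose(info_content(p1 * p2), info_content(p1) + info_content(p2))
assert info_content(1.0) == 0.0
```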

25. The difference between knowledge and understanding is like the difference between learning and memorizing.

Explanation

The statement is comparing the difference between knowledge and understanding to the difference between learning and memorization. It suggests that just as learning is different from memorizing, knowledge is different from understanding. This implies that knowledge involves acquiring information, while understanding involves comprehending and applying that knowledge in a meaningful way.

26. For every n-ary prefix code there is at least one n-ary code tree such that each codeword corresponds to the sequence of labels on a unique path from the root of the tree to a leaf.

Explanation

This statement is true. It states that for every n-ary prefix code, there exists at least one n-ary code tree in which each codeword corresponds to a sequence of labels on a unique path from the root of the tree to a leaf. This means that there is a one-to-one mapping between the codewords and the paths in the tree, ensuring that each codeword can be uniquely decoded.

27. Information is:

Explanation

The correct answer is "Inkrement znanja" (an increment of knowledge). This means that information is what increases knowledge: each piece of information received contributes to expanding or enhancing one's understanding of a subject or field.

28. When a logarithm with base two is used:

Explanation

When using a logarithm with base two, the quantity of information is expressed in bits. This is because the binary system, which is based on two digits (0 and 1), is commonly used to represent and store information in computers. In the binary system, each bit represents a single unit of information, making it the appropriate unit of measurement when using a logarithm with base two.

29. Entropy is applicable as a measure of uncertainty for:

Explanation

Entropy is applicable as a measure of uncertainty for systems of possibilities where the probabilities of individual outcomes are not known to us.

30. The basic goal of efficient coding is to encode a given information source so that:

Explanation

The basic goal of efficient coding is to encode the given source of information in such a way that the average length of the code is minimized. By minimizing the average length of the code, we can achieve a more efficient representation of the information, reducing the amount of storage or transmission required. This helps in optimizing resources and improving overall efficiency.

31. A problem that can be solved thanks to Shannon's concept of a measure of uncertainty is:

Explanation

The correct answer is "efikasno kodovanje diskretnog izvora informacija u formi niza simbola" which translates to "efficient coding of a discrete source of information in the form of a symbol sequence." This is because Shannon's concept of uncertainty measures provides a framework for effectively encoding information in a way that minimizes redundancy and maximizes efficiency. By using techniques such as Huffman coding or arithmetic coding, the information can be compressed and transmitted more efficiently, reducing the amount of data needed to represent the source.

32. The dimension of an alphabet is called:

Explanation

The correct answer is "Arnost" (arity), the term for the dimension, i.e. the size, of an alphabet. "Parnost" means parity (evenness) and "ortogonalnost" means orthogonality; neither is related to the dimension of an alphabet.

33. A code is said to be ___________ if no codeword in it is a prefix of another codeword.

Explanation

A code is said to be "prefiksan" (a prefix code) if none of its codewords is a prefix of another codeword.

34. If the probability of a message is higher:

Explanation

When the probability of a message is higher, it means that the message is more likely to occur. In information theory, the amount of information in an event is inversely proportional to its probability. Therefore, if the probability of a message is higher, the amount of information it carries is lower. Hence, the correct answer is that the amount of information is smaller when the probability of a message is higher.

35. I(X;Y) = I(Y;X)

Explanation

The given statement "I(X;Y) = I(Y;X)" is true. This equation represents the mutual information between two random variables X and Y, which measures the amount of information that X and Y share. The mutual information is symmetric, meaning that it does not matter which variable is considered as X or Y, the result will be the same. Therefore, the statement is correct.
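Symmetry can be checked numerically from any joint distribution; a Python sketch (the joint pmf below is made up for illustration):

```python
from math import log2, isclose

def mutual_information(joint):
    """I(X;Y) = sum over (x, y) of p(x,y) * log2(p(x,y) / (p(x) p(y)))."""
    px, py = {}, {}
    for (x, y), p in joint.items():
        px[x] = px.get(x, 0.0) + p
        py[y] = py.get(y, 0.0) + p
    return sum(p * log2(p / (px[x] * py[y]))
               for (x, y), p in joint.items() if p > 0)

joint = {(0, 0): 0.5, (0, 1): 0.1, (1, 0): 0.15, (1, 1): 0.25}
swapped = {(y, x): p for (x, y), p in joint.items()}  # roles of X and Y exchanged
assert isclose(mutual_information(joint), mutual_information(swapped))
```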

36. The role of the encoder is:

Explanation

The correct answer is "To transform source messages into codewords." The role of a coder is to convert source messages into codewords. This process involves encoding or transforming the information in the source messages into a format that can be easily transmitted or stored. By doing so, the coder ensures that the information is accurately represented and can be efficiently decoded by the receiver.

37. If the entropy of the source is less than the channel capacity:

Explanation

If the entropy of the source is less than the channel capacity, it means that the source has less uncertainty or randomness compared to the capacity of the channel to transmit information. In this case, it is possible to design a communication scheme that can asymptotically transmit data without errors. This means that with enough time and resources, the communication scheme can achieve error-free transmission.

38. In entropy coding:

Explanation

In entropy encoding, messages with higher probability are assigned shorter codes, while messages with lower probability are assigned longer codes. This is because assigning shorter codes to more probable messages allows for more efficient encoding, as these messages occur more frequently and thus require fewer bits to represent. On the other hand, assigning longer codes to less probable messages ensures that the encoding is still uniquely decodable, as longer codes are less likely to be encountered and are therefore less likely to cause ambiguity during decoding.
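Huffman coding realizes exactly this assignment; a compact Python sketch that computes only the codeword lengths (the probabilities below are made up for illustration):

```python
import heapq

def huffman_lengths(probs):
    """Codeword lengths produced by binary Huffman coding, keyed by symbol."""
    heap = [(p, i, {s: 0}) for i, (s, p) in enumerate(probs.items())]
    heapq.heapify(heap)
    counter = len(heap)
    while len(heap) > 1:
        # Merge the two least probable subtrees; every symbol inside them
        # moves one level deeper, i.e. its codeword grows by one bit.
        p1, _, d1 = heapq.heappop(heap)
        p2, _, d2 = heapq.heappop(heap)
        merged = {s: l + 1 for s, l in {**d1, **d2}.items()}
        heapq.heappush(heap, (p1 + p2, counter, merged))
        counter += 1
    return heap[0][2]

lengths = huffman_lengths({"a": 0.5, "b": 0.25, "c": 0.15, "d": 0.10})
assert lengths["a"] <= lengths["b"] <= lengths["c"]  # more probable -> shorter
```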

39. When leaves of the tree do not correspond to any codeword, such leaves are called:

Explanation

The correct answer is "neiskorisceni listovi". This term refers to the leaves of a tree that do not correspond to any code word. In other words, they are unused or not utilized in the encoding process. "Nejednoznacni listovi" would refer to ambiguous leaves, which is not the case here.

40. Information consists of:

Explanation

The correct answer is "Podataka i dodeljenog znacenja" (data and assigned meaning), because information consists of data plus a meaning assigned to it. Data alone has no significance until it is processed and interpreted; only then does it become information. The combination of data and assigned meaning is therefore necessary for information to exist.

41. Knowledge determines:

Explanation

The correct answer is "Nacin upotrebe" because it refers to the way something is used or applied. Knowledge determines the manner in which something is utilized or put into practice. This implies that understanding or familiarity with a subject or skill is essential in determining how it is utilized.

42. The uncertainty of a joint system is:

Explanation

The correct answer is that the joint uncertainty is at most the sum of the uncertainties of the individual systems. Entropy is subadditive: H(X,Y) ≤ H(X) + H(Y), with equality exactly when the systems are independent. Any statistical dependence between the systems reduces the joint uncertainty below the sum, because part of the information in one system is already contained in the other.

43. Data comes in the form of raw observations and measurements.

Explanation

The explanation for the given correct answer is that the data is in the form of unprocessed observations and dimensions. This implies that the data has not been analyzed or manipulated in any way and is in its raw form. Therefore, the answer "Da" (Yes) is correct as it acknowledges that the data is indeed in its unprocessed state.

44. The root of a probabilistic n-ary tree is assigned the probability:

Explanation

The given answer is 1. This suggests that the probability assigned to the root of the probabilistic n-ary tree is 1.

45. If the probability of the first message equals p:

Explanation

The correct answer is that the probability of the second message is (1-p). This can be inferred from the given information that the probability of the first message is p. Since the probability of an event and its complement must add up to 1, the probability of the second message would be (1-p).

46. The Shannon limit represents:

Explanation

The Shannon limit ("Senonova granica") refers to communication at the maximum possible rate, approaching the channel capacity, with an arbitrarily small probability of transmission error.

47. H(X|E) denotes:

Explanation

The correct answer is "Uslovna entropija varijable X u odnosu na dogadjaj E." This is because "H(X|E)" represents the conditional entropy of variable X given event E.

48. The difference between knowledge and understanding is like the difference between writing and reading.

Explanation

The statement likens the difference between knowledge and understanding to the difference between writing and reading, but the statement is false. While reading and writing are distinct skills, knowledge and understanding are closely related rather than opposed: one can acquire knowledge through reading and deepen understanding of a subject by writing and reflecting on that knowledge. The analogy therefore does not hold.

49. With turbo codes we obtain:

Explanation

Turbo codes are error-correcting codes that allow communication at rates close to the channel capacity with an arbitrarily small probability of error. With turbo codes, then, we can communicate at nearly the maximum possible rate, approaching the Shannon limit.

50. A code tree and a prefix code are:

Explanation

A code tree and a prefix code are equivalent objects. A code tree is a tree used to encode information, while a prefix code is an encoding that maps symbols to strings of bits. Each codeword of a prefix code corresponds to a unique root-to-leaf path in a code tree, and conversely, so the two are equivalent descriptions of the same encoding.

51. H(X|Y) ≤

Explanation

The given expression H(X|Y) ≤ H(X) represents the conditional entropy of X given Y is less than or equal to the entropy of X. This means that the uncertainty or information content of X, given the knowledge of Y, is less than or equal to the uncertainty of X alone. In other words, knowing Y reduces the uncertainty in X, which is reflected in the conditional entropy being smaller than the entropy of X.
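The inequality can be demonstrated on a concrete joint distribution; a Python sketch (the pmf is made up for illustration):

```python
from math import log2

def entropy(pmf):
    """H = -sum p log2 p over the values of a pmf dict."""
    return -sum(p * log2(p) for p in pmf.values() if p > 0)

def conditional_entropy(joint):
    """H(X|Y) = H(X,Y) - H(Y) for a joint pmf over (x, y) pairs."""
    py = {}
    for (x, y), p in joint.items():
        py[y] = py.get(y, 0.0) + p
    return entropy(joint) - entropy(py)

joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}
px = {0: 0.5, 1: 0.5}  # marginal of X for the joint above
# Knowing Y can only reduce, never increase, the uncertainty about X.
assert conditional_entropy(joint) <= entropy(px)
```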

52. A code for a discrete information source is called uniquely decodable:

Explanation

The correct answer is "if and only if any finite sequence of codewords uniquely corresponds to one source message". This means that for a discrete source code, each sequence of codewords can be uniquely decoded to one source message without any ambiguity. There is a one-to-one mapping between the codewords and source messages, ensuring that the decoding process is unambiguous.

53. The maximum entropy value of a binary source is:

Explanation

The maximum value of entropy for a binary source is 1 bit. This means that the source has equal probability for each of the two possible outcomes. In other words, the source is completely unpredictable and provides the maximum amount of information per symbol, which is 1 bit.

54. Redundancy is:

Explanation

Redundancy ("redundansa") is the presence of superfluous or repeated data in a communication system, data that can be removed without loss of information. Here, "suvisnost jednog izvora informacija" means the superfluity of a given information source, so the answer refers to the redundancy of a single source.

55. The basic source coding scheme:

Explanation

The given answer states that the source symbol Ut at time t is transformed into the corresponding codeword Zt. This suggests that there is a coding scheme where each source symbol is mapped to a specific codeword.

56. A source is memoryless if:

Explanation

A source is memoryless if the probability of symbol Ui does not depend on previously emitted values. This means that each symbol is emitted independently and the probability distribution remains constant over time, regardless of the past emitted values.

57. What is the ultimate communication rate for a given communication channel? (Shannon)

Explanation

The correct answer is "Kapacitet kanala C" (Channel capacity C). Channel capacity refers to the maximum rate at which information can be reliably transmitted through a communication channel. It is determined by the channel's bandwidth and signal-to-noise ratio. The higher the channel capacity, the faster the communication speed. In this context, "krajnja brzina komuniciranja" (maximum communication speed) is synonymous with channel capacity.

58. A measure of the quantity of information should decrease monotonically as the probability of the message increases.

Explanation

The statement is saying that the amount of information should decrease as the probability of the message increases. This is true because if a message is very probable, it carries less information since it is expected and not surprising. On the other hand, if a message is unlikely, it carries more information because it is unexpected and surprising. Therefore, the statement is correct.

59. The local mutual information of random variables Y and X tells how much one observation:

Explanation

This answer states that Y=y provides information about the other random variable X. This means that knowing the specific value of Y allows us to gain insight or make predictions about the value of X.

60. Mutual information is a measure of the interdependence of two random variables that take values:

Explanation

The explanation for the given answer is that mutual information measures the degree of dependence between two random variables. In this case, it is stated that the mutual information is taken over the same set, indicating that the two random variables are being compared and analyzed based on the same set of values. This implies that the mutual information is being calculated by considering the relationship and interdependence between the two variables within the context of the same set of values.

61. The generator of messages, that is, of sequences of symbols, is:

Explanation

The correct answer is "Izvor informacija" because in the context of communication systems, the "Izvor informacija" refers to the source of information or the message generator. This could be a person, a device, or any entity that produces the initial message or sequence of symbols to be transmitted through the communication channel. The "Komunikacioni kanal" refers to the channel or medium through which the message is transmitted. Therefore, the correct answer is the first option, "Izvor informacija," as it represents the initial stage of the communication process.

62. If the alphabet is finite:

Explanation

The given answer states that the information source is discrete. This means that the information provided is in a finite set or can be divided into distinct categories. In contrast, a continuous information source would have an infinite number of possibilities or could be measured on a continuous scale. Since the question mentions that the alphabet is finite, it implies that the information source is also finite and therefore discrete.

63. The manner of use determines:

Explanation

Knowledge is determined by the manner of use, because it is acquired through experience, learning, and understanding: the way we use the information and data we have determines our knowledge. Wisdom is tied more to applying knowledge in practice, while information and data are only the raw material from which knowledge can be derived.

64. For optimal encoding of a message:

Explanation

In order to achieve optimal coding of a message, it is important to describe the quantity of information in the messages. This means that the coding should accurately represent the information content of the message. It is not relevant whether the quantity of information is below the capacity of the channel or if there are only a few messages. The focus should be on accurately describing the information in the messages.

65. Context determines:

Explanation

The context determines the meaning or significance of something. In this case, the correct answer is "Informacija" which means "Information" in English. The context in which something is presented or discussed can greatly impact its interpretation and understanding.

66. If someone sends two messages:

Explanation

The correct answer is "Ukupna kolicina informacije je jednaka zbiru pojedinacne kolicine informacije dve poruke." This statement is based on the principle that the total amount of information is equal to the sum of the individual amounts of information of two messages. In other words, when two messages are sent, the total amount of information transmitted is the sum of the amount of information in each message.

67. The entropy H for randomly placing a piece on an empty chessboard is:

Explanation

The entropy H measures the amount of uncertainty or randomness in a system. A chessboard has 64 squares, and if the piece is placed on any square with equal probability, the entropy is H = log2(64) = 6 bits.
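The calculation behind the answer, as a short sketch in Python:

```python
from math import log2

def uniform_entropy(n):
    """Entropy in bits of a uniform choice among n equally likely outcomes."""
    return log2(n)

assert uniform_entropy(64) == 6.0  # random placement on a 64-square chessboard
assert uniform_entropy(8) == 3.0   # compare: three bits encode eight messages
```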

68. Coding is the process:

Explanation

Coding is the process of assigning a code to a message before sending it through the channel.

69. By Laplace's principle of insufficient reason, which states that if we know nothing definite about a phenomenon:

Explanation

The correct answer is "razumno je pretpostaviti jednake verovatnoce mogucih ishoda" (it is reasonable to assume equal probabilities for the possible outcomes), because Laplace's principle of insufficient reason states that, when we have no specific information about an event, it is reasonable to treat all possible outcomes as equally likely. The principle rests on the absence of any evidence that would favor one outcome over another.

70. A probabilistic n-ary tree is an n-ary tree whose nodes are assigned:

Explanation

A probabilistic n-ary tree ("verovatnosno n-arno stablo") is an n-ary tree in which each node is assigned a probability ("dodeljene verovatnoce").

71. What is the reason we encode anything at all?

Explanation

Coding is used for compression, to reduce the size of the data and ease its transmission; for reliable transmission, to ensure the data arrives accurately and dependably; and for secrecy, to ensure that only authorized users can access and understand the data.

72. To what extent can a given data set be compressed? (Shannon)

Explanation

The correct answer is "Entropija izvora H" (the source entropy H). The source entropy measures the average information content of the data: the higher the entropy, the less compressible the data set, while lower entropy indicates greater potential for compression. The extent to which a data set can be compressed therefore depends on the entropy of the source.

73. An information source is stationary if:

Explanation

The correct answer is "verovatnoce emitovanja simbola P(Ut=Ui) ne zavise od vremena t." This means that the probabilities of emitting symbols do not depend on time t. In other words, the source of information is stationary if the probabilities of emitting symbols remain constant over time.

74. The channel capacity C defines:

Explanation

The correct answer is "Krajnju brzinu datog komunikacionog kanala." This means that the capacity of channel C is defined as the maximum speed at which data can be transmitted through the communication channel.

75. Music and speech possess a complexity below which the signals of these sources cannot be compressed further without loss. He called this complexity:

Explanation

The correct answer is "Entropija H" (entropy H). Shannon called this complexity the entropy: it is the limit below which the signals of such sources cannot be compressed without loss.

76. Computing is limited by communication, while communications are limited by computing.

Explanation

The statement suggests that computer science is limited by communication, while communication is limited by computer science. This implies that computer science relies on effective communication to function properly, while communication is influenced and shaped by advancements in computer science. Therefore, the statement is true as it highlights the interconnectedness between these two fields.

Submit
77.
With a total of three bits it is possible to encode:

Explanation

With a total of three bits, it is possible to encode eight different messages. This is because each bit can have two possible values (0 or 1), and with three bits, there are 2^3 = 8 possible combinations. Therefore, there can be eight different messages encoded using these three bits.
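The count 2^3 = 8 can be enumerated directly; a small Python sketch:

```python
from itertools import product

# Each of the three bits takes one of two values, giving 2**3 = 8 patterns.
codewords = [''.join(bits) for bits in product('01', repeat=3)]
# codewords == ['000', '001', '010', '011', '100', '101', '110', '111']
```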

Submit
78.
The source entropy H defines:

Explanation

The correct answer is "Meru do koje se neki skup podataka moze komprimovati, Prosecnu kolicinu informacija." The explanation for this answer is that entropy of a source H defines the measure to which a set of data can be compressed and the average amount of information. Entropy represents the amount of uncertainty or randomness in a source, and by measuring it, we can determine the maximum compression that can be achieved on a given dataset. Additionally, entropy also represents the average amount of information contained in each symbol or data point from the source.

Submit
79. Coding for compression represents:

Explanation

The quiz's answer is that coding for compression represents the discarding of redundancy ("odbacivanje redudanse"). Compression aims to reduce the size of the data without losing information, which is achieved by identifying and eliminating repeated or superfluous parts of it. Discarding redundancy is the key step in reaching an efficient, minimal file size.

Submit
80.
The source entropy represents:

Explanation

The source entropy represents the average quantity of information that the source generates. It quantifies the uncertainty in the information the source produces, and is computed as the sum, over all possible events, of each event's probability multiplied by the negative logarithm of that probability. The higher the entropy, the greater the uncertainty in the information the source generates.

Submit
81.
Message A has probability ?, and message B ?

Explanation

The stated correct answer is that the quantity of information is greater in message A. Since self-information is I(x) = -log2 p(x), the quantity of information in a message falls as its probability rises. The specific probabilities are elided in the question, but for message A to carry more information than message B, its probability must be the smaller of the two.

Submit
82. In a probabilistic n-ary tree, the average (expected) depth of the leaves equals:

Explanation

The average (expected) value of the depth of the leaves in a probabilistic n-ary tree is equal to the sum of the probabilities of the internal nodes. This means that we calculate the average depth of the leaves by summing up the probabilities of all the internal nodes in the tree.
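This property (sometimes called the path length lemma) can be checked on a small invented probabilistic binary tree:

```python
# Invented probabilistic binary tree:
#
#          root (p=1.0)
#         /            \
#   leaf A (0.5)    internal node (p=0.5)
#                    /             \
#              leaf B (0.25)   leaf C (0.25)

leaves = {'A': (0.5, 1), 'B': (0.25, 2), 'C': (0.25, 2)}  # name: (prob, depth)
internal_probs = [1.0, 0.5]  # the root plus the single internal node

expected_depth = sum(p * d for p, d in leaves.values())  # 0.5 + 0.5 + 0.5
lemma_sum = sum(internal_probs)                          # 1.0 + 0.5
```

Both quantities come out to 1.5, matching the claim that the expected leaf depth equals the sum of the internal node probabilities.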

Submit
83. The source entropy represents a fundamental limit:

Explanation

The source entropy is the fundamental limit on data compression: data cannot be losslessly compressed below the entropy of its source, since doing so would necessarily lose information. Entropy measures the uncertainty, or information content, of the source; the higher it is, the less predictable and the less compressible the data. Source entropy therefore determines what an optimal compression algorithm can achieve.

Submit
84.
Entropy coding is a method:

Explanation

Entropy coding is a method for determining a uniquely decodable code of minimal length: it encodes messages so that the expected code length is minimized while decoding remains unambiguous. The method rests on entropy as the measure of uncertainty, or information, in a message, and it uses statistics on how frequently each symbol appears in order to assign an optimal codeword to every symbol.

Submit
85.
The measure of the quantity of information should be such that it increases monotonically as the probability of the message increases

Explanation

The statement claims that the measure of the quantity of information should increase monotonically with the probability of the message, which is false. In information theory the quantity of information decreases as probability increases: I(x) = -log2 p(x), so the more probable an event is, the less information it carries. Therefore the statement is incorrect.

Submit
86.
The expression for computing the quantity of information is:

Explanation

The expression I(x) = -log2 px is the correct formula for the quantity of information. It is the base-2 logarithm of the reciprocal of the probability of event x: the less probable the event, the more information its occurrence carries. The formula comes from Shannon's information theory, in which the quantity of information grows as the probability of the event shrinks.
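A minimal Python sketch of the formula:

```python
import math

def information(p):
    """Self-information I(x) = -log2 p(x), in bits."""
    return -math.log2(p)

# A fair coin flip carries 1 bit; a 1-in-8 event carries 3 bits.
coin = information(0.5)
rare = information(1 / 8)
```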

Submit
87.
Computing is information-limited, while information is computing-limited

Explanation

The statement "Racunarstvo je limitirano informaciom, dok su informacije racunarski limitirane" translates to "Computing is limited by information, while information is limited by computing." The marked answer, "Tvrdnja nije tacna" ("The statement is not true"), judges this claim false.

Submit
88.
When and why to use something is defined by:

Explanation

The question is asking about what defines when and why to use something. The concept of "mudrost" (wisdom) encompasses the ability to make sound judgments and decisions based on knowledge and experience. Wisdom is often considered to be a higher level of understanding and insight, which allows individuals to determine the appropriate time and reasons for utilizing something. Therefore, "mudrost" is the correct answer in this context.

Submit
89.
When the base-3 logarithm is used:

Explanation

When the base-3 logarithm is used, the quantity of information is expressed in ternary units (trits), just as the base-2 logarithm expresses it in bits.

Submit
90. Do instantaneously decodable codes require storing previously received codewords, or waiting for new ones to arrive, before decoding can be performed?

Explanation

Instantaneously decodable codes do not require storing previously received codewords or waiting for new ones to arrive in order for decoding to be performed; each codeword can be decoded as soon as it ends.

Submit
91. H(X,Y)≤

Explanation

The correct answer, H(X)+H(Y), expresses the subadditivity of entropy: the joint entropy of two random variables X and Y is at most the sum of their individual entropies. The combined uncertainty of X and Y together never exceeds the sum of their separate uncertainties; equality holds exactly when X and Y are independent, and any dependence between them makes the joint entropy strictly smaller.
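Subadditivity can be verified on an extreme example, X and Y perfectly correlated (the joint distribution below is invented for illustration):

```python
import math

def H(probs):
    """Shannon entropy in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# X == Y always: the pair carries only 1 bit, though each marginal has 1 bit.
joint = {(0, 0): 0.5, (1, 1): 0.5}
h_xy = H(joint.values())               # H(X,Y) = 1 bit
bound = H([0.5, 0.5]) + H([0.5, 0.5])  # H(X) + H(Y) = 2 bits
```

Equality H(X,Y) = H(X) + H(Y) would hold only if X and Y were independent.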

Submit
92.
Given a binary information source, the entropy of the binary source depends on:

Explanation

The entropy of a binary source depends on the probabilities of the messages. The more probable a message is, the less information its occurrence carries, and the entropy is the average of that information weighted by the message probabilities. The message probabilities therefore directly determine the entropy of the binary source.

Submit
93.
The entropy of a binary source is defined by the expression:

Explanation

The given expression is the entropy of a binary source, a measure of the uncertainty, or randomness, of the source. It is computed from the probabilities of the two possible outcomes, p and (1-p): the term -p log2 p is the contribution of the first outcome and -(1-p) log2 (1-p) that of the second. Because the logarithm of a probability is never positive, the minus signs make the entropy non-negative. Therefore the correct answer is H = -p log2 p - (1-p) log2 (1-p).
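The formula is easy to evaluate; a sketch showing the maximum at p = 0.5:

```python
import math

def binary_entropy(p):
    """H(p) = -p*log2(p) - (1-p)*log2(1-p), with H(0) = H(1) = 0."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

peak = binary_entropy(0.5)  # 1 bit, the maximum
skew = binary_entropy(0.1)  # below 1 bit
```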

Submit
94.
Descriptive complexity equals the length of:

Explanation

The explanation for the correct answer is that descriptive complexity refers to the amount of information needed to describe something. In this case, the question is asking about the length of the description. The correct answer, "Minimalnog opisa" (Minimal description), suggests that the length of the description is minimal, meaning it requires the least amount of information to describe something. This implies that the descriptive complexity is low, as it doesn't require a lengthy or detailed description.

Submit
95.
Wisdom determines:

Explanation

The given answer states that none of the options provided are correct for the question. However, since the question is incomplete and does not provide any context or information, it is not possible to determine the correct answer or provide a suitable explanation.

Submit
96.
If a source generates 4 messages with probability ?, the total quantity of information is:

Explanation

If a source generates 4 messages with equal probabilities, the total amount of information can be calculated by multiplying the number of messages by the number of bits needed to represent each message. In this case, since there are 4 messages and each message requires 2 bits to represent (log2(4) = 2), the total amount of information is 8 bits.
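The arithmetic can be spelled out:

```python
import math

n_messages = 4
bits_per_message = math.log2(n_messages)    # log2(4) = 2 bits each
total_bits = n_messages * bits_per_message  # 4 * 2 = 8 bits in total
```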

Submit
97. I(X;Y)=

Explanation

The given correct answer is H(X)-H(X|Y), H(Y)-H(Y|X). This answer suggests that the mutual information between two random variables X and Y can be calculated by subtracting the conditional entropy of X given Y from the entropy of X, and subtracting the conditional entropy of Y given X from the entropy of Y. This formula represents the amount of information that X and Y share, taking into account the uncertainty of each variable given the other.
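Both forms of the mutual information can be checked numerically on an invented joint distribution:

```python
import math

def H(ps):
    """Shannon entropy in bits."""
    return -sum(p * math.log2(p) for p in ps if p > 0)

# An invented joint distribution with positively correlated X and Y.
joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}
px = [sum(p for (a, _), p in joint.items() if a == x) for x in (0, 1)]
py = [sum(p for (_, b), p in joint.items() if b == y) for y in (0, 1)]

h_x, h_y, h_xy = H(px), H(py), H(joint.values())
i_xy = h_x - (h_xy - h_y)  # H(X) - H(X|Y)
i_yx = h_y - (h_xy - h_x)  # H(Y) - H(Y|X)
```

The two expressions agree, and the dependence between X and Y makes the mutual information strictly positive.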

Submit
98. Kraft's inequality gives us information:

Explanation

The given correct answer states that Kraftova nejednakost (Kraft's inequality) provides information about when there can be a unique prefix code. This means that when Kraftova nejednakost holds true, it is possible to have a code where no codeword is a prefix of another codeword, ensuring that the code can be uniquely decoded.
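Kraft's inequality, sum over i of n^(-l_i) <= 1, can be checked directly; the codeword lengths below are illustrative:

```python
def kraft_sum(lengths, n=2):
    """Kraft sum for codeword lengths over an n-ary code alphabet."""
    return sum(n ** -l for l in lengths)

# Lengths of a valid binary prefix code, e.g. {0, 10, 110, 111}:
ok = kraft_sum([1, 2, 3, 3])        # == 1.0, so a prefix code exists
impossible = kraft_sum([1, 1, 2])   # > 1, so no binary prefix code exists
```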

Submit
99. Conditional entropy H(X,Y)=

Explanation

The correct answer is H(Y) + H(X|Y). The identity being tested is the chain rule for the joint entropy H(X,Y): the total uncertainty of the pair equals H(Y), the uncertainty of Y alone, plus the conditional entropy H(X|Y), the uncertainty that remains about X once Y is known. Adding the two gives the combined uncertainty of X and Y together.
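The chain rule H(X,Y) = H(Y) + H(X|Y) can be verified on a small invented joint distribution:

```python
import math

def H(ps):
    """Shannon entropy in bits."""
    return -sum(p * math.log2(p) for p in ps if p > 0)

# Invented joint distribution of (X, Y).
joint = {(0, 0): 0.5, (0, 1): 0.25, (1, 1): 0.25}
py = {0: 0.5, 1: 0.5}  # marginal distribution of Y

# H(X|Y) = sum over y of p(y) * H(X | Y=y)
h_x_given_y = 0.0
for y, p_y in py.items():
    conditional = [p / p_y for (_, yy), p in joint.items() if yy == y]
    h_x_given_y += p_y * H(conditional)

h_y = H(py.values())
h_xy = H(joint.values())  # equals h_y + h_x_given_y by the chain rule
```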

Submit
100. When coding for reliable transmission under noisy channel conditions:

Explanation

When we have coding for high-quality transmission in channel noise conditions, we add redundancy. Adding redundancy helps in error detection and correction, which improves the overall reliability of the transmission. By duplicating or adding extra bits to the original data, the receiver can identify and correct any errors that may have occurred during transmission. This ensures that the received data is as close to the original as possible, even in the presence of noise in the channel.

Submit
101.
If a source generates 4 messages with probability ?, the quantity of information per message is:

Explanation

The correct answer is 2 bit-a. This is because the amount of information in a message is determined by the number of possible outcomes, which is equal to the logarithm base 2 of the number of outcomes. In this case, there are 4 possible outcomes, so the amount of information per message is log2(4) = 2 bit-a.

Submit
102. A code for a discrete information source is called nonsingular:

Explanation

The correct answer is when different source symbols correspond to different codewords. This means that each symbol from the source has a unique representation in the code.

Submit
103. Compressing an information source:

Explanation

Compressing an information source increases the entropy per symbol. Entropy measures the unpredictability of information; compression removes redundancy, so the symbols of the compressed stream are more varied and less predictable than those of the original, which raises the entropy per symbol.
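The effect is visible empirically: compressing highly redundant data with zlib yields output whose byte distribution has much higher empirical entropy (a rough demonstration, not a formal proof):

```python
import math
import zlib
from collections import Counter

def byte_entropy(data):
    """Empirical entropy of the byte distribution, in bits per byte."""
    counts = Counter(data)
    total = len(data)
    return -sum(c / total * math.log2(c / total) for c in counts.values())

raw = b'ABABABAB' * 500          # highly redundant: exactly 1 bit/byte
packed = zlib.compress(raw)

h_raw = byte_entropy(raw)
h_packed = byte_entropy(packed)  # much closer to the 8 bits/byte maximum
```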

Submit
104. Codewords are sequences of symbols from the code alphabet:

Explanation

The correct answer is "kodne reci su sekvence simbola iz kodnog alfabeta koji je u opstem slucaju razlicit od alfabeta poruka." This means that the code words are sequences of symbols from a code alphabet that is generally different from the alphabet of the messages. This implies that the code words are encoded using a different set of symbols than the original message, which is a common practice in coding and encryption.

Submit
105.
The entropy of the uniform probability distribution over n possibilities:

Explanation

The correct answer states that the entropy of the uniform distribution over n possibilities equals the measure of uncertainty of the corresponding system of possibilities considered without any specified outcome probabilities. When all outcomes are equally likely, the entropy reduces to log2 n, the maximal uncertainty such a system can have.
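For a uniform distribution the general entropy formula collapses to log2 n, which matches the Hartley measure for n possibilities; a sketch:

```python
import math

def uniform_entropy(n):
    """Entropy of n equally likely outcomes; reduces to log2(n)."""
    p = 1 / n
    return -sum(p * math.log2(p) for _ in range(n))

h8 = uniform_entropy(8)  # log2(8) = 3 bits
```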

Submit
106.
Information is formed by analyzing the connections and relations between pieces of information.

Explanation

The given statement suggests that information is formed by analyzing the connections and relationships between pieces of information. It does not imply that information is formed by grouping data into a single entity. Therefore, the correct answer is "No."

Submit
107. For any two prefix codes over the same information source, the code with the shorter expected length has lower symbol entropy.

Explanation

The statement is false. Symbol entropy is a property of the information source alone: it depends only on the probabilities of the source symbols, not on the code used to represent them. Any two prefix codes over the same source therefore have exactly the same symbol entropy, regardless of which one has the shorter expected codeword length.

Submit

Quiz Review Timeline (Updated): Jul 22, 2024 +

Our quizzes are rigorously reviewed, monitored and continuously updated by our expert board to maintain accuracy, relevance, and timeliness.

  • Current Version
  • Jul 22, 2024
    Quiz Edited by
    ProProfs Editorial Team
  • Nov 10, 2013
    Quiz Created by
    Singi123