Questions tagged [markov-chains]
Markov chains are stochastic processes that transition from one state to another based only on their current state. They are widely used across statistical domains to generate sequences according to transition probabilities.
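For a concrete picture of the tag's subject, here is a minimal Python sketch (states and probabilities are made up for illustration) of stepping through such a chain:

import random

# Hypothetical transition probabilities: state -> {next_state: probability}
transitions = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def step(state):
    """Pick the next state using only the current state's probabilities."""
    next_states, probs = zip(*transitions[state].items())
    return random.choices(next_states, weights=probs)[0]

state = "sunny"
sequence = [state]
for _ in range(10):
    state = step(state)
    sequence.append(state)
print(sequence)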
573 questions
98 votes · 6 answers · 30k views
Is a Markov chain the same as a finite state machine?
Is a finite state machine just an implementation of a Markov chain? What are the differences between the two?
75 votes · 3 answers · 58k views
How do Markov Chain Chatbots work?
I was thinking of creating a chatbot using something like Markov chains, but I'm not entirely sure how to get it to work. From what I understand, you create a table from data with a given word and ...
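A common way to flesh out that table idea (a rough sketch with an invented one-line corpus, not the asker's data) is to map each word to the words observed immediately after it, then walk the table:

import random
from collections import defaultdict

corpus = "the cat sat on the mat and the cat slept"

# Build the table: word -> list of words observed right after it
table = defaultdict(list)
words = corpus.split()
for current, following in zip(words, words[1:]):
    table[current].append(following)

# Generate text by repeatedly sampling a follower of the current word
word = random.choice(words)
output = [word]
for _ in range(8):
    followers = table.get(word)
    if not followers:
        break
    word = random.choice(followers)
    output.append(word)
print(" ".join(output))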
62 votes · 5 answers · 37k views
What is the difference between Markov chains and hidden Markov models?
What is the difference between Markov chain models and a hidden Markov model? I've read about them on Wikipedia, but couldn't understand the differences.
57 votes · 2 answers · 3k views
Issues implementing the "Wave Collapse Function" algorithm in Python
In a nutshell:
My implementation of the Wave Collapse Function algorithm in Python 2.7 is flawed but I'm unable to identify where the problem is located. I would need help to find out what I'm ...
24 votes · 5 answers · 50k views
Generating Markov transition matrix in Python
Imagine I have a series of 4 possible Markovian states (A, B, C, D):
X = [A, B, B, C, B, A, D, D, A, B, A, D, ....]
How can I generate a Markov transition matrix using Python? The matrix must ...
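One straightforward NumPy sketch of the counting-and-normalizing step, using the visible portion of the sequence above (the mapping of labels to indices is mine):

import numpy as np

X = list("ABBCBADDABAD")          # visible portion of the example sequence
states = sorted(set(X))            # ['A', 'B', 'C', 'D']
index = {s: i for i, s in enumerate(states)}

# Count transitions, then normalize each row to get probabilities
counts = np.zeros((len(states), len(states)))
for a, b in zip(X, X[1:]):
    counts[index[a], index[b]] += 1

row_sums = counts.sum(axis=1, keepdims=True)
row_sums[row_sums == 0] = 1        # avoid division by zero for unseen states
P = counts / row_sums
print(P)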
23 votes · 1 answer · 4k views
When to use a certain Reinforcement Learning algorithm?
I'm studying Reinforcement Learning and reading Sutton's book for a university course. Besides the classic DP, MC, TD and Q-Learning algorithms, I'm reading about policy gradient methods and genetic ...
20 votes · 3 answers · 8k views
Using Markov chains (or something similar) to produce an IRC-bot
I tried Google and found little that I could understand.
I understand Markov chains at a very basic level: it's a mathematical model that only depends on the previous input to change states... so sort of a ...
20 votes · 1 answer · 4k views
What is a chain in PyMC3?
I am learning PyMC3 for Bayesian modeling. You can create a model and sample with:
import pandas as pd
import pymc3 as pm
# obs is a DataFrame with a single column, containing
# the observed values ...
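In PyMC3, pm.sample runs several independent Markov chains; a hedged sketch with a made-up normal model (not the asker's obs DataFrame) that makes the chains explicit:

import numpy as np
import pymc3 as pm

obs = np.random.normal(loc=5.0, scale=2.0, size=100)  # stand-in for the observed column

with pm.Model() as model:
    mu = pm.Normal("mu", mu=0.0, sigma=10.0)
    sigma = pm.HalfNormal("sigma", sigma=5.0)
    pm.Normal("y", mu=mu, sigma=sigma, observed=obs)
    # Four independent Markov chains, each producing 1000 draws after tuning
    trace = pm.sample(1000, chains=4, tune=1000)

# Draws from a single chain can be inspected separately
first_chain_mu = trace.get_values("mu", chains=[0])
print(first_chain_mu.shape)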
20 votes · 3 answers · 38k views
Simple random English sentence generator [closed]
I need a simple random English sentence generator. I need to populate it with my own words, but it needs to be capable of making longer sentences that at least follow the rules of English, even if ...
20 votes · 2 answers · 5k views
Can an author's unique "literary style" be used to identify him/her as the author of a text? [closed]
Let's imagine, I have two English language texts written by the same person.
Is it possible to apply some Markov chain algorithm to analyse each: create some kind of fingerprint based on statistical ...
18 votes · 14 answers · 8k views
Any business examples of using Markov chains?
What business cases are there for using Markov chains? I've seen the sort of play area of a Markov chain applied to someone's blog to write a fake post. I'd like some practical examples, though. E.g. ...
16 votes · 1 answer · 2k views
Decoding sequences in a GaussianHMM
I'm playing around with Hidden Markov Models for a stock market prediction problem. My data matrix contains various features for a particular security:
01-01-2001, .025, .012, .01
01-02-2001, -.005, -...
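Assuming the GaussianHMM in question is the one from the hmmlearn package, a minimal decoding sketch on stand-in feature rows might look like this:

import numpy as np
from hmmlearn.hmm import GaussianHMM

# Stand-in feature matrix: one row per day, columns are the security's features
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))

model = GaussianHMM(n_components=3, covariance_type="diag", n_iter=100)
model.fit(X)

hidden_states = model.predict(X)     # most likely state per row (Viterbi)
log_prob, states = model.decode(X)   # same idea, also returns the log-likelihood
print(hidden_states[:10], log_prob)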
15 votes · 4 answers · 3k views
Directed probability graph - algorithm to reduce cycles?
Consider a directed graph which is traversed from first node 1 to some final nodes (which have no more outgoing edges). Each edge in the graph has a probability associated with it. Summing up the ...
15 votes · 2 answers · 4k views
How can I make a discrete state Markov model with pymc?
I am trying to figure out how to properly make a discrete state Markov chain model with pymc.
As an example (view in nbviewer), let's make a chain of length T=10 where the Markov state is binary, the ...
14 votes · 2 answers · 7k views
Best way to calculate the fundamental matrix of an absorbing Markov Chain?
I have a very large absorbing Markov chain (scales to problem size -- from 10 states to millions) that is very sparse (most states can react to only 4 or 5 other states).
I need to calculate one row ...
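Because only one row of the fundamental matrix N = (I - Q)^-1 is needed, a common trick is to solve a single sparse linear system instead of inverting; a rough SciPy sketch with a tiny made-up Q:

import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import spsolve

# Tiny example: Q is the transient-to-transient block of the transition matrix
Q = sp.csc_matrix(np.array([[0.0, 0.5, 0.2],
                            [0.3, 0.0, 0.3],
                            [0.1, 0.4, 0.0]]))
n = Q.shape[0]
A = sp.identity(n, format="csc") - Q

# Row i of N = (I - Q)^-1 without forming N: solve A^T y = e_i, then y is that row
i = 0
e_i = np.zeros(n)
e_i[i] = 1.0
row_i_of_N = spsolve(A.T.tocsc(), e_i)
print(row_i_of_N)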
12 votes · 1 answer · 3k views
Simulating a Markov Chain with Neo4J
A Markov chain is composed of a set of states which can transition to other states with a certain probability.
A Markov chain can be easily represented in Neo4J by creating a node for each state, a ...
11 votes · 2 answers · 11k views
iPython Notebook; Plotting transition diagrams
My question is dead simple.
Is there a package to plot state-transition or Markov diagrams that look like any of the following? I am thinking it has to exist, but I simply can't find it!
I've ...
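One option, sketched here with networkx and matplotlib (state names and probabilities invented for illustration):

import matplotlib.pyplot as plt
import networkx as nx

# Hypothetical transition probabilities
edges = [("A", "B", 0.7), ("A", "C", 0.3), ("B", "A", 0.4), ("B", "C", 0.6), ("C", "A", 1.0)]

G = nx.DiGraph()
for src, dst, p in edges:
    G.add_edge(src, dst, weight=p)

pos = nx.spring_layout(G, seed=42)
nx.draw(G, pos, with_labels=True, node_color="lightblue", node_size=1500, arrowsize=20)
nx.draw_networkx_edge_labels(G, pos, edge_labels={(u, v): f"{d['weight']:.1f}"
                                                  for u, v, d in G.edges(data=True)})
plt.show()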
11 votes · 1 answer · 6k views
Why should we use RNNs instead of Markov models?
Recently I stumbled across this article, and I was wondering what the difference between the results you would get from a recurrent neural net, like the ones described above, and a simple Markov chain ...
11 votes · 1 answer · 2k views
What is the best/fastest way to construct a very large Markov chain from simulation data?
I have written a C++ program that simulates a certain process I'm studying. It outputs discrete "states" each timestep of the simulation. For example:
a
b
c
b
c
b
would be the output of a simulation ...
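When the output is too large to hold as a dense matrix, one option is to stream the file and keep sparse counts; a rough sketch assuming one state label per line in a hypothetical states.txt:

from collections import defaultdict
from itertools import tee

def pairwise(iterable):
    """s -> (s0, s1), (s1, s2), ..."""
    a, b = tee(iterable)
    next(b, None)
    return zip(a, b)

# state -> {next_state: count}, kept sparse because most pairs never occur
counts = defaultdict(lambda: defaultdict(int))

with open("states.txt") as f:
    labels = (line.strip() for line in f if line.strip())
    for current, following in pairwise(labels):
        counts[current][following] += 1

# Normalize on the fly when a probability is needed
def transition_prob(a, b):
    total = sum(counts[a].values())
    return counts[a][b] / total if total else 0.0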
10 votes · 3 answers · 384 views
Optimisation of recursive algorithm in Java
Background
I have an ordered set of data points stored as a TreeSet<DataPoint>. Each data point has a position and a Set of Event objects (HashSet<Event>).
There are 4 possible Event ...
9 votes · 2 answers · 11k views
What are the differences between Monte Carlo and Markov chains techniques?
I want to develop a RISK board game, which will include an AI for computer players. Moreover, I read two articles, this and this, about it, and I realised that I must learn about Monte Carlo simulation ...
9 votes · 2 answers · 3k views
How to test if a string contains gibberish?
I am making a registration form, and because some users will enter gibberish into the Secret Answer input (I do that myself), I would like to test that value to see if it's likely to be a good answer. I have ...
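A common Markov-chain take on this (a sketch only, trained on a tiny inline sample rather than a real English corpus) scores a string by the average log-probability of its character bigrams and flags strings that score far below typical text:

import math
from collections import defaultdict

ALPHABET = "abcdefghijklmnopqrstuvwxyz "

def train(text):
    counts = defaultdict(lambda: defaultdict(lambda: 1))  # add-one smoothing
    filtered = [c for c in text.lower() if c in ALPHABET]
    for a, b in zip(filtered, filtered[1:]):
        counts[a][b] += 1
    probs = {}
    for a in ALPHABET:
        total = sum(counts[a][b] for b in ALPHABET)
        probs[a] = {b: counts[a][b] / total for b in ALPHABET}
    return probs

def avg_log_prob(text, probs):
    filtered = [c for c in text.lower() if c in ALPHABET]
    pairs = list(zip(filtered, filtered[1:]))
    if not pairs:
        return float("-inf")
    return sum(math.log(probs[a][b]) for a, b in pairs) / len(pairs)

# In practice, train on a large English corpus; with a real corpus,
# gibberish scores noticeably lower than plausible answers.
probs = train("the quick brown fox jumps over the lazy dog and then some more english text")
print(avg_log_prob("my mother's maiden name", probs))
print(avg_log_prob("qzxv kjqq wxzzp", probs))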
9 votes · 1 answer · 1k views
Efficient implementation of Markov chains in Julia
I want to simulate the movement of a random walker in a network as efficiently as possible. Below I show a toy model with the three approaches I have tried so far. I should note that in my original ...
9 votes · 1 answer · 4k views
How to vectorize a random walk simulation in MATLAB
I am rewriting a Monte Carlo simulation model in MATLAB with an emphasis on readability. The model involves many particles, represented as (x,y,z), following a random walk over a small set of states ...
8 votes · 1 answer · 3k views
Change the size of the arrowheads in a Markov chain plot
I've plotted a Markov chain in R, but I dislike the rather huge arrowheads that the plot function is drawing. Is there a way to make the heads smaller?
library( markovchain )
transition.matrix <-...
8 votes · 3 answers · 6k views
Finding stationary distribution of a Markov process given a transition probability matrix
There have been two threads related to this issue on Stack Overflow:
How can I obtain stationary distribution of a Markov Chain given a transition probability matrix describes what a transition ...
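For a modest dense matrix, one standard route is the left eigenvector of P for eigenvalue 1; a small NumPy sketch with a made-up matrix:

import numpy as np

P = np.array([[0.9, 0.1, 0.0],
              [0.2, 0.7, 0.1],
              [0.1, 0.3, 0.6]])   # rows sum to 1

# The stationary pi satisfies pi P = pi, i.e. pi is a left eigenvector of P
# for eigenvalue 1, equivalently a (right) eigenvector of P.T
eigvals, eigvecs = np.linalg.eig(P.T)
i = np.argmin(np.abs(eigvals - 1.0))
pi = np.real(eigvecs[:, i])
pi = pi / pi.sum()
print(pi, pi @ P)   # pi @ P should reproduce pi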
8 votes · 1 answer · 7k views
Constructing a multi-order Markov chain transition matrix in Matlab
A first-order transition matrix of 6 states can be constructed very elegantly as follows
x = [1 6 1 6 4 4 4 3 1 2 2 3 4 5 4 5 2 6 2 6 2 6]; % the Markov chain
tm = full(sparse(x(1:end-1),x(2:end),1)...
8 votes · 3 answers · 3k views
Graphical Markov chain in JavaScript [closed]
I have a Markov chain that I would like to represent graphically in JavaScript. I need to represent the nodes, links, and transition probabilities. Perhaps something like one of these two diagrams:
...
7 votes · 3 answers · 18k views
R library for discrete Markov chain simulation
I am looking for something like the 'msm' package, but for discrete Markov chains. For example, if I had a transition matrix defined as such
Pi <- matrix(c(1/3,1/3,1/3,
0,2/3,1/6,
2/3,0,1/2))
for ...
7 votes · 1 answer · 1k views
How to know where to join by space in spaCy NLP output
I am using spaCy's NLP model to work out the POS of input data so that my Markov chains can be a bit more grammatically correct, as with the example in the Python markovify library found here. ...
7 votes · 3 answers · 1k views
Algorithm for computing the plausibility of a function / Monte Carlo Method
I am writing a program that attempts to duplicate the algorithm discussed at the beginning of this article,
http://www-stat.stanford.edu/~cgates/PERSI/papers/MCMCRev.pdf
F is a function from char to ...
7 votes · 1 answer · 5k views
Understanding how to construct a higher-order Markov chain
Suppose I want to predict if a person is of class1=healthy or of class2=fever. I have a data set with the following domain: {normal,cold,dizzy}
The transition matrix would contain the probability of ...
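One common construction folds the last k observations into a single compound state, which turns a k-th order chain into an ordinary first-order one; a sketch for k = 2 over the {normal, cold, dizzy} domain (the observation sequence is invented):

from collections import defaultdict

observations = ["normal", "cold", "dizzy", "dizzy", "cold", "normal", "normal", "cold", "dizzy"]
k = 2

# Compound state = tuple of the last k observations
counts = defaultdict(lambda: defaultdict(int))
for i in range(len(observations) - k):
    state = tuple(observations[i:i + k])
    nxt = observations[i + k]
    counts[state][nxt] += 1

# Row-normalize to get P(next | last k observations)
transitions = {
    state: {nxt: c / sum(nxts.values()) for nxt, c in nxts.items()}
    for state, nxts in counts.items()
}
print(transitions[("normal", "cold")])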
7 votes · 1 answer · 704 views
Reconstructing now-famous 17-year-old's Markov-chain-based information-retrieval algorithm "Apodora"
While we were all twiddling our thumbs, a 17-year-old Canadian boy has apparently found an information retrieval algorithm that:
a) performs with twice the precision of the current, and widely-used ...
7 votes · 1 answer · 451 views
How to create a nested dictionary from existing dictionary with set and list of tuples
I have parsed a midi file, and I've successfully gotten a dictionary of notes broken up by instrument. An abbreviated example of this is note_dict below, truncated for the purposes of this question.
...
7 votes · 1 answer · 3k views
How to visually animate Markov chains in Python?
I want to "visually" animate Markov chains like here : http://markov.yoriz.co.uk/ but using Python instead of html css and javascript.
I don't know if there is any library that makes this ...
6 votes · 6 answers · 14k views
Building a Transition Matrix using words in Python/Numpy
I'm trying to build a 3x3 transition matrix with this data
days=['rain', 'rain', 'rain', 'clouds', 'rain', 'sun', 'clouds', 'clouds',
'rain', 'sun', 'rain', 'rain', 'clouds', 'clouds', 'sun', 'sun',...
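A compact pandas sketch of the same counting, using only the portion of the data visible above (pd.crosstab with normalize='index' gives row-normalized transition probabilities):

import pandas as pd

days = ['rain', 'rain', 'rain', 'clouds', 'rain', 'sun', 'clouds', 'clouds',
        'rain', 'sun', 'rain', 'rain', 'clouds', 'clouds', 'sun', 'sun']

# Cross-tabulate each observation against the one that follows it,
# normalizing each row so the entries are transition probabilities
P = pd.crosstab(pd.Series(days[:-1], name='from'),
                pd.Series(days[1:], name='to'),
                normalize='index')
print(P)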
6 votes · 2 answers · 14k views
R: function to generate a mixture distribution
I need to generate samples from a mixture distribution
40% samples come from Gaussian(mean=2,sd=8)
20% samples come from Cauchy(location=25,scale=2)
40% samples come from Gaussian(mean = 10, sd=6)
...
6 votes · 1 answer · 13k views
Hidden test cases not passing for Google Foobar Challenge Doomsday Fuel [closed]
I'm working my way through the Google Foobar challenge and am now at the level 3 challenge Doomsday Fuel. The instructions are as follows:
Doomsday Fuel
Making fuel for the LAMBCHOP's reactor core ...
6 votes · 1 answer · 12k views
Hidden Markov model in MATLAB
I have 11 states, and a transition probability matrix, but I don't have emissions as my model is not hidden. It consists only of states (1,2,3, ..., 11)
I want to generate random states based on my ...
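The question targets MATLAB, but the underlying step is just repeated categorical sampling from the current state's row; a NumPy sketch of that idea with a small made-up 3-state matrix (the real chain has 11 states):

import numpy as np

rng = np.random.default_rng(0)

# Made-up 3-state transition matrix standing in for the 11-state one
P = np.array([[0.7, 0.2, 0.1],
              [0.3, 0.4, 0.3],
              [0.2, 0.3, 0.5]])

n_states = P.shape[0]
state = 0
path = [state]
for _ in range(20):
    state = rng.choice(n_states, p=P[state])   # sample next state from the current row
    path.append(state)
print(path)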
6 votes · 2 answers · 1k views
How does MCMC help Bayesian inference?
Literature says that the Metropolis-Hastings algorithm in MCMC is one of the most important algorithms developed last century and is revolutionary. Literature also says that it is such development in ...
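For intuition, here is a bare-bones Metropolis sketch (symmetric Gaussian proposal, made-up unnormalized target): because the chain's stationary distribution is the target, its samples can stand in for draws from a posterior that cannot be sampled directly.

import math
import random

def unnormalized_target(x):
    """Some unnormalized density we can evaluate but not sample directly."""
    return math.exp(-0.5 * ((x - 3.0) / 1.5) ** 2)

x = 0.0
samples = []
for _ in range(50_000):
    proposal = x + random.gauss(0.0, 1.0)           # symmetric proposal
    accept_prob = min(1.0, unnormalized_target(proposal) / unnormalized_target(x))
    if random.random() < accept_prob:
        x = proposal                                 # accept; otherwise keep current x
    samples.append(x)

print(sum(samples) / len(samples))   # should be close to the target mean of 3.0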
6 votes · 2 answers · 2k views
Probability of each terminal node in a directed graph
I have a directed graph G(V,E) and weight w(u,v).
In this graph, the weight w(u,v) represents how many times node v has been visited from node u. For example (see this for a directed graph image):
1 ...
6 votes · 4 answers · 587 views
Generating a pseudo-natural phrase from a big integer in a reversible way
I have a large and "unique" integer (actually a SHA1 hash).
Note: While I'm talking here about SHA1 hashes, this is not a cryptography / security question! I'm not trying to break SHA1. Imagine a ...
6 votes · 2 answers · 2k views
How to create paragraphs from markov chain output?
I would like to modify the script below so that it creates paragraphs out of a random number of the sentences generated by the script. In other words, concatenate a random number (like 1-5) of ...
5 votes · 3 answers · 2k views
Convert text prediction script [Markov Chain] from JavaScript to Python
I've been trying for the last couple of days to convert this JS script to Python code.
My implementation (mostly a blind copy, with some minor fixes here and there) so far:
import random
class markov:
...
5 votes · 2 answers · 3k views
How do Markov Chains work and what is memorylessness?
How do Markov Chains work? I have read the Wikipedia article on Markov chains, but the thing I don't get is memorylessness. Memorylessness states that:
The next state depends only on the current state and ...
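For reference, the memorylessness (Markov) property is usually written as
P(X_{n+1} = x | X_n = x_n, X_{n-1} = x_{n-1}, ..., X_0 = x_0) = P(X_{n+1} = x | X_n = x_n),
that is, conditioning on the entire history gives the same next-state distribution as conditioning on the current state alone.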
5 votes · 2 answers · 704 views
Markov Chains and decimal points in R?
I have plotted a Markov chain from a matrix in R. However, I have numerous probabilities under 0.01, and thus my probability plot looks something like this:
I've been searching for hours and I can't ...
5 votes · 4 answers · 4k views
Markov chain stationary distributions with scipy.sparse?
I have a Markov chain given as a large sparse scipy matrix A. (I've constructed the matrix in scipy.sparse.dok_matrix format, but converting to other ones or constructing it as csc_matrix are fine.)
...
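One sparse-friendly route is scipy.sparse.linalg.eigs on the transpose, asking for the dominant eigenvector; a sketch with a small stand-in matrix (the real A would stay in CSR/CSC form throughout):

import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import eigs

# Small stand-in for the large sparse row-stochastic matrix A
A = sp.csr_matrix(np.array([[0.8, 0.1, 0.1, 0.0],
                            [0.2, 0.6, 0.1, 0.1],
                            [0.1, 0.2, 0.6, 0.1],
                            [0.0, 0.1, 0.3, 0.6]]))

# The stationary vector is the left eigenvector of A for eigenvalue 1,
# i.e. the dominant right eigenvector of A.T
vals, vecs = eigs(A.T, k=1, which="LM")
pi = np.real(vecs[:, 0])
pi = pi / pi.sum()
print(pi)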
5 votes · 2 answers · 3k views
Estimating confidence intervals of a Markov transition matrix
I have a series of n=400 sequences of varying length containing the letters ACGTE.
For example, consider the probability of having C after A, which can be calculated from the set of empirical sequences, ...
5 votes · 1 answer · 5k views
Creating a smart text generator
I'm doing this for fun (or as 4chan says "for teh lolz") and if I learn something on the way all the better. I took an AI course almost 2 years ago now and I really enjoyed it but I managed to forget ...
5 votes · 1 answer · 434 views
Difference Between J48 and Markov Chains
I am attempting to do some evaluation of the relative rates of different algorithms in the C# and F# realms using WekaSharp and one of the algorithms I was interested in was Markov Chains. I know Weka ...