Continually apply basic, simple fundamentals over a long period of time
Install opencv3 with CUDA support on a mac
Install from source (recommended):
sudo xcode-select --install # if not done already
brew tap homebrew/science
brew install cmake pkg-config jpeg libpng libtiff openexr eigen tbb
cd ~/CppProjects/
git clone --depth 1 http://ift.tt/29XJGg7
git clone --depth 1 http://ift.tt/2apYYrH
cd /Users/kaiyin/anaconda3/envs/tensorflow/lib/
ln -s libpython3.5m.dylib libpython3.5.dylib

# install for python 3.5
# tensorflow is an anaconda python3.5 environment on my machine created for tensorflow
export ENV_TENSORFLOW=/Users/kaiyin/anaconda3/envs/tensorflow
export PREFIX=/opt/local
export PY_DYLIB="$ENV_TENSORFLOW/lib/libpython3.5.dylib"
export OPENCV_CONTRIB=~/CppProjects/opencv_contrib/modules
export PY_INCLUDE="$ENV_TENSORFLOW/include/python3.5m"
export PY_BINARY="$ENV_TENSORFLOW/bin/python3.5"
# the trailing ".." in the cmake command assumes you are in a build directory
# inside the cloned opencv source tree, e.g.:
# cd ~/CppProjects/opencv && mkdir -p build && cd build
cmake -D CMAKE_BUILD_TYPE=RELEASE \
    -D CMAKE_INSTALL_PREFIX=$PREFIX \
    -D OPENCV_EXTRA_MODULES_PATH="$OPENCV_CONTRIB" \
    -D PYTHON3_LIBRARY="$PY_DYLIB" \
    -D PYTHON3_INCLUDE_DIR="$PY_INCLUDE" \
    -D PYTHON3_EXECUTABLE="$PY_BINARY" \
    -D BUILD_opencv_python2=OFF \
    -D BUILD_opencv_python3=ON \
    -D INSTALL_PYTHON_EXAMPLES=ON \
    -D INSTALL_C_EXAMPLES=OFF \
    -D BUILD_EXAMPLES=ON ..
make -j8 # use 8 jobs for compiling
sudo make install
cp $PREFIX/lib/python3.5/site-packages/cv2.cpython-35m-darwin.so $ENV_TENSORFLOW/lib/python3.5/site-packages

# install for python 2.7
# tf27 is an anaconda python2.7 environment on my machine created for tensorflow
export ENV_TENSORFLOW=/Users/kaiyin/anaconda3/envs/tf27
export PREFIX=/opt/local
export PY_DYLIB="$ENV_TENSORFLOW/lib/libpython2.7.dylib"
export OPENCV_CONTRIB=~/CppProjects/opencv_contrib/modules
export PY_INCLUDE="$ENV_TENSORFLOW/include/python2.7"
export PY_BINARY="$ENV_TENSORFLOW/bin/python2.7"
# again, run cmake from a clean build directory inside the opencv source tree
cmake -D CMAKE_BUILD_TYPE=RELEASE \
    -D CMAKE_INSTALL_PREFIX=$PREFIX \
    -D OPENCV_EXTRA_MODULES_PATH="$OPENCV_CONTRIB" \
    -D PYTHON2_LIBRARY="$PY_DYLIB" \
    -D PYTHON2_INCLUDE_DIR="$PY_INCLUDE" \
    -D PYTHON2_EXECUTABLE="$PY_BINARY" \
    -D BUILD_opencv_python2=ON \
    -D BUILD_opencv_python3=OFF \
    -D INSTALL_PYTHON_EXAMPLES=ON \
    -D INSTALL_C_EXAMPLES=OFF \
    -D BUILD_EXAMPLES=ON ..
make -j8 # use 8 jobs for compiling
sudo make install
cp $PREFIX/lib/python2.7/site-packages/cv2.so $ENV_TENSORFLOW/lib/python2.7/site-packages/
Verify your installation in python 2.7:
# source activate tf27
(tf27) kaiyin@kaiyins-mbp 21:11:12 | /opt/local/lib/python3.5/site-packages
=> ipython
Python 2.7.13 |Continuum Analytics, Inc.| (default, Dec 20 2016, 23:05:08)
Type "copyright", "credits" or "license" for more information.

IPython 5.1.0 -- An enhanced Interactive Python.
?         -> Introduction and overview of IPython's features.
%quickref -> Quick reference.
help      -> Python's own help system.
object?   -> Details about 'object', use 'object??' for extra details.

IPython profile: kaiyin

In [1]: import cv2

In [2]: cv2.__version__
Out[2]: '3.2.0-dev'
Verify your installation in python 3.5:
# source activate tensorflow
(tensorflow) kaiyin@kaiyins-mbp 21:13:13 | /opt/local/lib/python3.5/site-packages
=> ipython
Python 3.5.2 |Continuum Analytics, Inc.| (default, Jul 2 2016, 17:52:12)
Type "copyright", "credits" or "license" for more information.

IPython 5.1.0 -- An enhanced Interactive Python.
?         -> Introduction and overview of IPython's features.
%quickref -> Quick reference.
help      -> Python's own help system.
object?   -> Details about 'object', use 'object??' for extra details.

IPython profile: kaiyin

In [1]: import cv2; cv2.__version__
Out[1]: '3.2.0-dev'
from Blogger http://ift.tt/2jqnw79 via IFTTT
Examples of linear convolutional filters
The sharpening filter perhaps needs a bit of explanation. Suppose the pixel under the center of the filter has value v = x + d, while all the surrounding pixels have value v_a = x. After filtering, the center value becomes:
\begin{align*} v' &= \frac{17}{9}(x + d) - \frac{8}{9}x \\ &= x + \frac{17}{9}d \end{align*}
When the difference between the pixel and its surroundings is zero, the filter has no effect; when there is a difference, it is amplified by a factor of 17/9. This is why it is called a sharpening filter.
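For concreteness, here is a minimal sketch of applying this filter with OpenCV, assuming the 3x3 kernel implied by the computation above (17/9 at the center, -1/9 at each of the 8 neighbours); the image path is just a placeholder.

import numpy as np
import cv2

# Sharpening kernel implied by the computation above:
# 17/9 at the center, -1/9 at each of the 8 neighbours (the entries sum to 1).
kernel = np.full((3, 3), -1.0 / 9.0)
kernel[1, 1] = 17.0 / 9.0

img = cv2.imread("some_image.jpg", cv2.IMREAD_GRAYSCALE)   # placeholder path
sharpened = cv2.filter2D(img, -1, kernel)                  # -1: keep the input depth
cv2.imwrite("some_image_sharpened.jpg", sharpened)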
from Blogger http://ift.tt/2jANlTx via IFTTT
Dodge and burn
Dodging has an over-exposure effect: light pixels tend to be pushed towards white. Burning is the opposite: dark pixels tend to be pushed towards black.
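In formula form, the two cv2.divide calls in the code below compute, up to rounding and with the result saturated to the 8-bit range (OpenCV defines division by zero as 0, so M = 255 is the degenerate case):

\begin{align*} \text{dodge}(I, M) &= \min\left(255, \frac{256 \, I}{255 - M}\right) \\ \text{burn}(I, M) &= 255 - \min\left(255, \frac{256 \, (255 - I)}{255 - M}\right) \end{align*}

When the mask value M is close to 255 the denominator is small and the pixel is pushed towards white; when M is close to 0 the pixel is left almost unchanged (up to the 256/255 factor).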
import numpy as np
import cv2


def dodge(image, mask):
    return cv2.divide(image, 255 - mask, scale=256)


def burn(image, mask):
    return 255 - cv2.divide(255 - image, 255 - mask, scale=256)


class PencilSketch:
    def __init__(self, width, height, bg_gray="pencilsketch_bg.jpg"):
        self.width = width
        self.height = height
        self.canvas = cv2.imread(bg_gray, cv2.CV_8UC1)
        if self.canvas is not None:
            self.canvas = cv2.resize(self.canvas, (width, height))

    def render(self, img_rgb):
        img_gray = cv2.cvtColor(img_rgb, cv2.COLOR_RGB2GRAY)
        img_gray_inv = 255 - img_gray
        img_blur = cv2.GaussianBlur(img_gray_inv, (21, 21), 0, 0)
        img_blend = dodge(img_gray, img_blur)
        return cv2.cvtColor(img_blend, cv2.COLOR_GRAY2RGB)


import matplotlib.pyplot as plt

# img = plt.imread("/Users/kaiyin/PycharmProjects/opencv3blueprints/chapter01/tree.jpg")
img_rgb = cv2.imread("/Users/kaiyin/PycharmProjects/opencv3blueprints/chapter01/tree.jpg", -1)
img_gray = cv2.cvtColor(img_rgb, cv2.COLOR_RGB2GRAY)
# print(img_rgb.shape)
plt.imshow(img_rgb)
plt.imshow(img_gray, cmap="gray")

img_dodge = dodge(img_gray, img_gray)
plt.imshow(img_dodge, cmap="gray")
img_burn = burn(img_gray, img_gray)
img_blur = cv2.GaussianBlur(img_gray, (21, 21), 3)
plt.clf(); plt.imshow(img_blur, cmap="gray")

# effect of dodging: pixels that are brighter than a certain threshold are pushed to 255
# (except that 255 is degenerated into 0)
plt.clf(); plt.scatter(img_gray.flatten(), img_dodge.flatten())
plt.clf(); plt.scatter(img_gray.flatten(), img_burn.flatten())

img_dodge1 = dodge(img_gray, img_blur)
img_burn1 = burn(img_gray, img_blur)
plt.clf(); plt.scatter(img_gray.flatten(), img_dodge1.flatten())
plt.clf(); plt.scatter(img_gray.flatten(), img_burn1.flatten())
plt.clf(); plt.imshow(img_dodge1, cmap="gray")
plt.clf(); plt.imshow(img_burn1, cmap="gray")
from Blogger http://ift.tt/2jAycBD via IFTTT
Eigenfaces
In this post we’ll talk about the application of principal component analysis to face recognition.

Eigenvectors as directions of variation
Given a d-dimensional dataset A with M samples, each sample being a \sqrt{d} \times \sqrt{d} face photo, we would like to find unit vectors in the R^d space along which the dataset varies the most around the mean \mu.

To simplify matters, let’s assume that the dataset has been centered: x_i \leftarrow x_i - \mu.

Suppose we have such a unit vector u; then the projection of a sample x_i onto u is (x_i \cdot u) u, so the coefficient is x_i \cdot u.
The variance along u:
\begin{align*} \text{Var}(u) &= \frac{1}{M}\sum u^T \underbrace{x_i}_{d \times 1} \underbrace{x_i^T}_{1 \times d} u \\ &= u^T \left[ \frac{1}{M}\sum \underbrace{x_i}_{d \times 1} \underbrace{x_i^T}_{1 \times d} \right] u \\ &= \underbrace{u^T}_{1 \times d} \underbrace{\Theta}_{d \times d} \underbrace{u}_{d \times 1} \end{align*}
where \Theta is the covariance matrix of the dataset.
To maximize Var(u), u needs to be the eigenvector that corresponds to the largest eigenvalue of \Theta.
Dimensionality trick
With large images, d is going to be large, which poses a numerical difficulty when you solve for the eigenvectors. There is a neat trick to overcome this.

We have \underbrace{\Theta}_{d \times d} = A^T \underbrace{A}_{M \times d} (dropping the 1/M factor, which does not change the eigenvectors) and want to find the eigenvectors of \Theta. Since d \gg M, we first find the eigenvectors of AA^T:
\begin{align*} \underbrace{AA^T}_{M\times M} v &= \lambda v \\ A^TAA^T v &= \lambda A^T v \\ \Theta \underbrace{A^T}_{d \times M} \underbrace{v}_{M \times 1} &= \lambda A^T v \\ \end{align*}
Thus we find an eigenvector v of AA^T, and A^T v is an eigenvector of A^TA (that is, of \Theta) with the same eigenvalue.
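Here is a minimal NumPy sketch of the trick; the data matrix A is random, purely to check numerically that an eigenvector of the small M x M matrix, mapped through A^T, is indeed an eigenvector of A^T A.

import numpy as np

M, d = 20, 400                       # M samples, each a flattened 20x20 "image"
rng = np.random.default_rng(0)
A = rng.normal(size=(M, d))
A -= A.mean(axis=0)                  # center the data

# Solve the small M x M eigenproblem instead of the d x d one.
small = A @ A.T                      # shape (M, M)
eigvals, V = np.linalg.eigh(small)   # columns of V are eigenvectors of A A^T

# Map each eigenvector v back: A^T v is an eigenvector of A^T A.
U = A.T @ V                          # shape (d, M)
U /= np.linalg.norm(U, axis=0)       # normalize the columns

# Check the largest eigenpair: (A^T A) u = lambda u.
u, lam = U[:, -1], eigvals[-1]
print(np.allclose((A.T @ A) @ u, lam * u))   # True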
Eigenfaces
The eigenvectors obtained this way can themselves be viewed as \sqrt{d} \times \sqrt{d} face images, the so-called eigenfaces:
Face reconstruction from eigenfaces
To reconstruct the face photos (approximately), do this:
A' = \underbrace{A}_{M \times d} \underbrace{U}_{d \times N} U^T
Where each column in U is an eigenface and AU gives you the coefficients.
Face space
But we don’t have to reconstruct the faces in order to analyze them: AU gives us a new dataset with its dimensionality reduced from R^d to R^N, and the latter is now called the face space.

Given a new sample x, a simplified face recognition procedure would be (a minimal sketch follows the steps below):

Project into the face space: x \leftarrow xU.

Find the most similar row for x in AU, the reduced training data (one nearest neighbor).
That’s it!
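A minimal NumPy sketch of this procedure, using the same notation (A is the centered M x d training matrix; U holds the top N eigenfaces as columns). The data here is random and U is just a random orthonormal matrix standing in for the real eigenfaces, so this only illustrates the shapes and the nearest-neighbor step.

import numpy as np

rng = np.random.default_rng(0)
M, d, N = 50, 400, 10
A = rng.normal(size=(M, d))                      # centered training faces, one per row
U = np.linalg.qr(rng.normal(size=(d, N)))[0]     # stand-in for the top N eigenfaces (orthonormal columns)

train_coeffs = A @ U                             # M x N: the training data in face space

def recognize(x):
    """Return the index of the nearest training face for a (centered) query x."""
    coeffs = x @ U                               # face-space coefficients of the query
    dists = np.linalg.norm(train_coeffs - coeffs, axis=1)
    return int(np.argmin(dists))

x = A[3] + 0.01 * rng.normal(size=d)             # a slightly perturbed training face
print(recognize(x))                              # 3, with overwhelming probability

# Approximate reconstruction from the coefficients, as in A' = A U U^T above:
x_reconstructed = (x @ U) @ U.T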
from Blogger http://ift.tt/2isIEd6 via IFTTT
Polar representation for lines
Let’s take an arbitrary point (x_0, y_0) in the xy-plane and consider all the possible lines passing through it: y_0 = m x_0 + b. Each of these lines can be represented as a point (m, b), subject to b = -x_0 m + y_0, in the mb-plane. The mb-plane is called the Hough space:
The problem is that when the line is vertical, m is infinite and completely off the chart, which is not a nice property if you ask me. The solution is to use the polar representation of lines instead of the slope-intercept representation.
In the figure below, AB is the line we want to represent with polar coordinates (\theta, d), where d is the distance from the origin to the line (CD) and \theta is the angle between the x-axis and the normal vector of the line (\angle BCD).
Let E be an arbitrary point on AB, then CF = GE = x, CG = y, GH = y / \tan \theta, CH = y / \sin \theta, HD = d - y / \sin \theta, HE = \frac{d - y / \sin \theta}{\cos \theta}. Using the fact GH + HE = x, you can derive that x\cos \theta + y\sin \theta = d, the polar representation we were looking for.
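A quick numerical sanity check of the identity (the angle and distance below are arbitrary choices):

import numpy as np

theta = 0.7                      # angle of the line's normal vector, arbitrary
d = 3.0                          # distance from the origin to the line, arbitrary

# Foot of the normal on the line, and the line's direction vector.
foot = d * np.array([np.cos(theta), np.sin(theta)])
direction = np.array([-np.sin(theta), np.cos(theta)])

# Any point on the line satisfies x*cos(theta) + y*sin(theta) = d.
for t in (-2.0, 0.0, 1.5, 10.0):
    x, y = foot + t * direction
    print(np.isclose(x * np.cos(theta) + y * np.sin(theta), d))   # True

This (d, \theta) pair is exactly the (rho, theta) parameterization that cv2.HoughLines returns for detected lines.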
from Blogger http://ift.tt/2i3oc2m via IFTTT
Category theory for programmers
Functor
Examples:
data Const c a = Const c

instance Functor (Const c) where
    fmap f (Const c) = Const c

data Identity a = Identity a

instance Functor Identity where
    fmap f (Identity a) = Identity $ f a
The Const functor illustrated:
Function type as a functor
Given a function f :: a -> b, we can write the arrow in prefix form, f :: (->) a b, and (->) a is a functor (its fmap is just function composition). (You don’t have to use the prefix form, but perhaps that’s easier to reason with. Suit yourself.)
In the illustration above, I use the f then g notation (more logical to me) instead of the usual g \circ f.
Bifunctor
Bifunctor illustrated:
From haskell documentation:
Formally, the class Bifunctor represents a bifunctor from Hask -> Hask.
Intuitively it is a bifunctor where both the first and second arguments are covariant.
You can define a Bifunctor by either defining bimap or by defining both first and second.
If you supply bimap, you should ensure that:
bimap id id ≡ id
If you supply first and second, ensure:
first id ≡ id second id ≡ id
If you supply both, you should also ensure:
bimap f g ≡ first f . second g
Examples of bifunctor:
instance Bifunctor Either where
    bimap f _ (Left a)  = Left (f a)
    bimap _ g (Right b) = Right (g b)

instance Bifunctor Const where
    bimap f _ (Const a) = Const (f a)

instance Bifunctor (,) where
    bimap f g ~(a, b) = (f a, g b)
Dissect Maybe as a Bifunctor
data Maybe1 a = Nothing1 | Just1 a
-- equivalent to: Either () (Identity a)
-- equivalent to: Either (Const () a) (Identity a)
-- We know that Either is a bifunctor and that both Const () and Identity
-- are functors, so Maybe1, built by composing them, is functorial in a
All algebraic data types are functorial, and Functor instances can be automatically generated in Haskell:
{-# LANGUAGE DeriveFunctor #-}

data Maybe2 a = Nothing2 | Just2 a deriving Functor
from Blogger http://ift.tt/2iDWvkz via IFTTT
Small experiment with dependent types in Swift 3
//: Playground - noun: a place where people can play

import Cocoa

// warmup exercise
// get the types of variables using swift reflection
let x = Mirror(reflecting: (1, 2, "e"))
let y = Mirror(reflecting: (3, 4, "hello"))
print(x.subjectType)
// compare types
print(x.subjectType == y.subjectType)

class TypeT<T> {
}

// return value is a Mirror (Swift jargon for reflection of types)
// this value depends on the value of the first param
func value<T>(_ val: Int, blank: T) -> Mirror {
    if(val == 0) {
        // if val is zero, then return the type of blank
        return Mirror(reflecting: blank)
    }
    // otherwise, wrap the type of blank inside a TypeT and recurse.
    return value(val - 1, blank: TypeT<T>())
}

let x1 = value(3, blank: 0) // Mirror for TypeT<TypeT<TypeT<Int>>>
from Blogger http://ift.tt/2hKyGq7 via IFTTT
Scala Enumerations
object EnumerationTypes extends App {

  object WeekDay extends Enumeration {
    type WeekDay = Value
    val Mon = Value("Mon")
    val Tue = Value("Tue")
    val Wed = Value("Wed")
    val Thu = Value("Thu")
    val Fri = Value("Fri")
    val Sat = Value("Sat")
    val Sun = Value("Sun")

    // Returns the value by name; if the name does not exist in the Enumeration, then return None
    def valueOf(name: String) = WeekDay.values.find(_.toString == name)
  }

  import WeekDay._

  def isWorkingDay(d: WeekDay) = !(d == Sat || d == Sun)

  val x = WeekDay.valueOf("Tue")

  WeekDay.values filter isWorkingDay foreach println
}
from Blogger http://ift.tt/1Ru0zhF via IFTTT
Let it be
From Beatles:
LET IT BE (Beatles)

G  D  Em  Cmaj7  C6
When I find myself in times of trouble, Mother Mary comes to me
G  D  C  G/B  Am  G
Speaking words of wisdom, let it be

And in my hour of darkness, She is standing right in front of me
Speaking words of wisdom, Let it be

Em  G/D  C  G
Let it be, let it be, let it be, let it be
G  D  C  G/B  Am  G
Whisper words of wisdom, let it be

And when the broken hearted people, Living in the world agree
There will be an answer, let it be
But though they may be parted, There is still a chance that they may see
There will be an answer, let it be

CHORUS:
Let it be, let it be, let it be, let it be
There will be an answer, let it be
Let it be, let it be, let it be, let it be
Whisper words of wisdom, let it be

LEAD

Let it be, let it be, let it be, let it be
Whisper words of wisdom, let it be

And when the night is cloudy, There is still a light that shines on me
Shine on till tomorrow, let it be
I wake up to the sound of music, Mother Mary comes to me
Speaking words of wisdom, let it be

CHORUS

Chords in instrumental section and ending:
C  G/B  Am  G  F  C/E  D  C  G
from Blogger http://ift.tt/1OCWtk9 via IFTTT
Firebase comment app in javascript
var firebaseData = new Firebase('http://ift.tt/1KdTomD');
var commentsDB = firebaseData.child("comments");

//commentsDB.transaction(function(currentval) {
//    return (currentval || 0) + 1;
//});

var getEpoch = function() {
    return (new Date()).getTime();
}

var epochToDate = function(epoch) {
    var d = new Date(0);
    d.setUTCMilliseconds(epoch);
    return d;
}

var handleCommentKeypress = function (e) {
    if (e.keyCode == 13) {
        var author = $("#author-field").val();
        var comment = $("#comment-field").val();
        if (author && comment) {
            var date = new Date();
            date = date.toString();
            commentsDB.push(
                {author: author, comment: comment, date: getEpoch()}
            );
        } else {
            alert("Author and Comment are required fields!");
        }
    }
};

commentsDB.on("child_added", function (snap) {
    var entry = snap.val();
    var entryLI = $("<li></li>").text(
        entry.author + ": " + entry.comment + " [ " + epochToDate(entry.date).toString() + " ] "
    )
    $("#comments-list").append(entryLI);
    $("#comment-field").val("");
})

$("#comment-field").keypress(handleCommentKeypress)
$("#author-field").keypress(handleCommentKeypress)

var ref = new Firebase("http://ift.tt/17TxbOo");
ref.orderByChild("height").on("child_added", function (snapshot) {
    console.log(snapshot.key() + " was " + snapshot.val().height + " meters tall");
});
from Blogger http://ift.tt/1KdTrPn via IFTTT
Javascript variable hoisting
var type = 'Ring Tailed Lemur';

function Lemur() {
    console.log(type);
    var type = 'Ruffed Lemur';
}

Lemur();
is translated into
var type = 'Ring Tailed Lemur';

function Lemur() {
    var type;                   // the declaration is hoisted to the top of the function
    console.log(type);          // logs undefined
    var type = 'Ruffed Lemur';  // the assignment stays where it was
}

Lemur();
undefined is logged in this case.
from Blogger http://ift.tt/1RKub9G via IFTTT
Manipulate clipboard in Javascript
window.addEventListener('copy', function (ev) {
    console.log('copy event');
    // you can set clipboard data here, e.g.
    ev.clipboardData.setData('text/plain', 'some text pushed to clipboard');
    // you need to prevent the default behaviour here, otherwise the browser will
    // overwrite your content with the currently selected text
    ev.preventDefault();
});
from Blogger http://ift.tt/1J1eI3J via IFTTT
Scala extractors
/**
  * Created by kaiyin on 1/10/16.
  */
object TestUnapply {

  case class Division(val number: Int) {
    // def unapply(divider: Int): Boolean = number % divider == 0
    def unapply(divider: Int): Option[(Int, Int)] =
      if (number % divider == 0) Some(number / divider, 0) else None

    def unapply(divider: Double): Boolean = number % divider.toInt == 0
  }

  object Division {
    def apply(number: Int) = new Division(number)
  }

  val divisionOf15 = Division(15)

  // y should be "3, 0" (15 divided by 5 is 3, remainder 0)
  val y = 5 match {
    // case DividedBy(15)() => true
    case divisionOf15(z, w) => s"$z, $w"
    case _ => s"Not divisible"
  }

  val z = 5.0 match {
    case divisionOf15() => "Divisible"
    case _ => "Not divisible"
  }
}
from Blogger http://ift.tt/1ZXjWBF via IFTTT
Relation between join, fmap and bind
Recognizing the equivalence:
join (fmap f ma) = ma >>= f
Proof:
join (fmap f ma)
fmap f ma >>= id                      -- join x = x >>= id (definition)
(ma >>= return . f) >>= id            -- fmap f xs = xs >>= return . f (1)
ma >>= (\x -> (return . f) x >>= id)  -- m >>= (\x -> k x >>= h) = (m >>= k) >>= h (2)
ma >>= (\x -> return (f x) >>= id)    -- (f . g) x = f (g x) (definition)
ma >>= (\x -> id (f x))               -- return a >>= k = k a (3)
ma >>= (\x -> f x)                    -- id x = x (definition)
ma >>= f                              -- (eta reduction)

(1) implied by the monad laws
(2) associativity monad law
(3) left identity monad law
Credit to verement.
from Blogger http://ift.tt/1IVflfg via IFTTT
Monoid in scala
object M {

  trait Monoid[A] {
    def append(a1: A, a2: A): A
    def empty: A
  }

  object Monoid {
    implicit def ListMonoid[A]: Monoid[List[A]] = new Monoid[List[A]] {
      def append(a1: List[A], a2: List[A]) = a1 ::: a2
      def empty = Nil
    }
  }

  def append[A](a1: A, a2: A)(implicit m: Monoid[A]): A = m.append(a1, a2)
}

import M._

implicitly[Monoid[List[Int]]].append(List(1, 2), List(3, 4))
append(List(1, 2), List(3, 4))
from Blogger http://ift.tt/1K3l8Ky via IFTTT
Riding the knight in haskell
Calculate the possible moves of a knight on a chess board:
import Control.Monad as M

type KnightPos = (Int, Int)

moveKnight :: KnightPos -> [KnightPos]
moveKnight (c, r) = do
    (c', r') <- possibleMoves
    M.guard (c' `elem` [1..8] && r' `elem` [1..8])
    return (c', r')
  where
    possibleMoves =
        [ \(a, b) (c, d) -> (c + a, d + b),
          \(a, b) (c, d) -> (c + a, d - b),
          \(a, b) (c, d) -> (c - a, d + b),
          \(a, b) (c, d) -> (c - a, d - b)
        ] <*> [(1, 2), (2, 1)] <*> [(c, r)]
from Blogger http://ift.tt/1kGvPvK via IFTTT
Introduction to Analysis by Rosenlicht, chapter 3: metric spaces
Definition of metric spaces
Definition. A metric space is a set E, together with a rule which associates with each pair p, q \in E a real number d \left( p, q \right) such that:

1. d(p, q) \ge 0, with d(p, q) = 0 if and only if p = q
2. d(p, q) = d(q, p)
3. d(p, r) \le d(p, q) + d(q, r) for all p, q, r \in E (the triangle inequality)
Proposition (Schwarz inequality): Cauchy–Schwarz inequality in vector form: \left| a \cdot b \right| \le \left| a \right| \left| b \right|
Corollary of Schwarz inequality:
Or in vector form: \left| a + b \right| \le \left| a \right| + \left| b \right|
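This follows from the Schwarz inequality in one line:

\begin{align*} \left| a + b \right|^2 = \left| a \right|^2 + 2 \, a \cdot b + \left| b \right|^2 \le \left| a \right|^2 + 2 \left| a \right| \left| b \right| + \left| b \right|^2 = \left( \left| a \right| + \left| b \right| \right)^2 \end{align*}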
Proposition: generalize the triangle inequality to multi-angle inequality:
Proposition: difference of two sides of a triangle is less than the third side.
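In metric-space notation this is a direct consequence of the triangle inequality:

\begin{align*} d(p, r) \le d(p, q) + d(q, r) &\Rightarrow d(p, r) - d(q, r) \le d(p, q) \\ d(q, r) \le d(q, p) + d(p, r) &\Rightarrow d(q, r) - d(p, r) \le d(p, q) \\ &\Rightarrow \left| d(p, r) - d(q, r) \right| \le d(p, q) \end{align*}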
Open and closed sets
Definition of open and closed balls:
Definition of open set:
Proposition: basic properties of metric spaces:
Proposition: an open ball is also an open set.
Definition of closed sets:
Proposition: a closed ball is also a closed set.
Definition of boundedness:
Proposition: a nonempty closed subset of R that is bounded above (below) contains a greatest (least) element.
Definition of convergence:
Uniqueness of convergence:
Definition of subsequence:
Subsequence of a convergent sequence is also convergent:
Convergent sequences are bounded:
from Blogger http://ift.tt/1IYCFIH via IFTTT