#its the best way to experience ml nowadays i think
aro-aizawa · 11 months
im so glad that so far the fandom's ideas of shadybug and claw noir haven't been ripped to shreds im going lightheaded at how much im giggling
tgbsupplementsinc · 4 years
10 Reasons 1-Testosterone is a stimulating Steroid
AAS Analysis:
1-Testosterone (dihydroboldenone), aka DHB, is a steroid that has been growing in popularity recently but isn't new to the bodybuilding world. DHB isn't a testosterone-based compound; it's simply the 5-alpha-reduced form of Equipoise (boldenone). DHB acts in a much different way than standard Testosterone or EQ: DHB is the dihydrotestosterone (DHT) version of Equipoise, just as DHT is to Testosterone. 1-Test is one of the most potent naturally occurring steroids to be isolated. From 2002-2005, 1-Testosterone was sold as a supplement until being added as a Schedule III drug in Jan. 2005. During those short 3 years, a number of these supplements sold with good success even though the oral bioavailability isn't very high. Supplement companies were using 1-Testosterone in oil-solubilized softgels attached to an undecanoate ester (think Andriol / lymphatic delivery) as well as in transdermal solutions/gels at the time. Both of those delivery methods have some positive effects, but the injectable preparations are, quite simply, more effective. In most cases, DHB is attached to the cypionate ester. There are no legitimate prescription DHB injectable preparations available, so the only options are from underground lab (UGL) sources. Always use caution when using anything from UGLs; UGLs aren't regulated and come with more risk of contamination. For this article, we are going to focus mainly on 1-Testosterone Cypionate. Here are ten reasons why 1-Testosterone is a steroid that I find very interesting.
1.) DHB has an anabolic to androgenic ratio of 200:100. By comparison, testosterone's A:A ratio is exactly 100:100, and Deca Durabolin is rated at 125:37, so DHB is twice as anabolic as testosterone and almost twice as anabolic as Deca. However, let's not forget that ratings given to anabolic androgenic steroids (AAS) are often misleading. DHB is very minimally androgenic. Side effects like aggression and blood pressure increases are going to be less likely to occur than with other steroids. Compared to other compounds, DHB is considered very mild, but this is all relative to the dose used. 1-Testosterone doesn't cause significant stress on the kidneys or other organs. 1-Testosterone is known to yield lean, quality tissue gains with little to no bloat and low side-effect potential.
2.) DHB is extremely anabolic and doesn't aromatize, which means it should yield nice lean muscle gains. DHB is more anabolic than testosterone, Equipoise, and Deca Durabolin. Since there's no aromatization to estrogen, there's no need to worry about estrogenic side effects like gynecomastia or water retention. Best of all, DHB can be a great pre-contest hormone since it'll give you minimal water retention. 1-Testosterone Cyp is great for helping to maintain muscle while on a contest diet, can be used for a lean bulk with relative ease by running it at moderate doses, or can be used with aromatizing compounds like Testosterone, Dianabol, or MENT for an all-out bulk when combined with a diet that reflects those goals.
3.) Unfortunately, injectable DHB is known to come with prominent post-injection pain (PIP). This happens after the injection, and I believe it may be because 1-Testosterone is an irritating substance on its own. This can even be experienced with some of the transdermal solutions. The PIP may cause many to avoid DHB and miss out on the potential benefits. From my experience, the PIP seems to vary greatly from person to person, similar to how some have PIP with steroids attached to the propionate ester. PIP can also vary depending on which underground lab is manufacturing the solution. Most commonly you'll find DHB dosed at 100-150mg/ml. The 150mg/ml is typically going to cause more PIP than the 100mg/ml. To help reduce the PIP, try diluting the DHB with another steroid or a sterile oil like grapeseed oil (sterilized). Heating the oil with a hot pad before doing the injection can sometimes help, as can doing your injection slowly. Using less volume with each injection can sometimes help take some of the sting out as well, so rather than doing 2ml on Mon. and Thurs. you can try doing 1ml on Mon., Tues., Thurs., and Fri.
4.) Another potential drawback is the volume of oil needed to give the user a strong effect. 1-Testosterone is very similar to Primobolan when it comes to the way they're dosed in mg/ml and the volume of oil that has to be used to reach the desired effect. Both DHB and Primo are typically dosed at 100mg/ml. Most users notice the best results once they get over 400mg per week. This means you're injecting a good bit of this painful oil into your muscles weekly. For many, this will prevent them from using this steroid.
5.) Curiously enough, DHB has some thermogenic properties, similar to trenbolone, mainly regarding sweating and, in some cases, insomnia. DHB gives very nice strength gains without hurting appetite, so it can be a very nice bulker. Although it may share its lineage with Equipoise or testosterone, users consider DHB more as trenbolone's baby brother. I personally feel this is overstating the power of DHB. Yes, it's a great anabolic, and I feel it's a stronger compound than Primobolan, but definitely milder than tren. The PIP makes it somewhat difficult to use in most cases, but I have personally had 1-Test Cyp that had little or no PIP a time or two, and in those cases, it wasn't an inconvenience at all to use. Side effects are meager overall, aside from the potential injection-site pain and a slight increase in body temperature. If you can find a 1-Testosterone Cyp with little PIP that's from a trusted source, it's worth using both in the offseason and in contest prep, depending on your diet and your goals. This compound could easily be implemented in either situation with great success. Just like all steroids, DHB is suppressive, so a thorough post cycle therapy (PCT) should be implemented unless you're on Testosterone replacement.
6.) 1-Testosterone is one of the few steroids that can be taken in oral form, transdermal solution, or injectable preparation. As stated above, to experience the most benefits the injectable preparation is best, but for some, the transdermal 1-Testosterone can be a nice addition or can be utilized by women in lower doses. Here is what some of the standard doses were when DHB was being sold as a supplement, before it was placed on the list as a Schedule III drug. When it was dosed orally and packaged in an oily solution in gel cap form, the oral dosing was 100-250mg daily. Transdermally, 75-100mg was applied daily. I personally enjoyed the transdermal version and ran it up to 200mg daily; this version is hard to come by nowadays. Starter doses with the injectable cypionate version would be 100-200mg added to a TRT dose of test. Keep in mind this is a very mild dose, and personally, I felt just subtle changes from it. If you were used to running only 100-200mg/wk of Testosterone for TRT and then added in 200mg of DHB, you'd probably notice a slight effect, but for a bodybuilder running much higher doses, this would probably need to be doubled or tripled to notice the true potential positive anabolic effects of this compound. Women have used this steroid at low doses, like 25mg daily of the oral version. Typically, the oral dosages are prepared much higher per cap. Injectable dosing for women is around 10-20mg per week. Any use of this compound by a female could still very easily cause masculinizing effects.
7.) When DHB is combined with moderate to higher doses of testosterone, an AI should be used. This is because 1-Testosterone has a high binding affinity for the androgen receptor, which means there's a good chance more testosterone is going to be displaced. This could very likely increase estrogen and free testosterone to above-normal levels. For instance, say a bodybuilder is using around 400-500 milligrams (mg) per week of DHB with long-ester testosterone dosed at 400-600mg/wk. Some guys may use 400-600mg of Testosterone by itself with no AI (I wouldn't recommend this, but some people get away with it), but when the 1-Testosterone is used in conjunction with an equivalent dose of testosterone, more of the testosterone can be aromatized, which yields a higher estrogen level.
8.) DHB does aromatize, but only at a very low level. It's not fully understood how; it's speculated that the body may potentially be inserting a double bond on its own at carbon-4. Even still, estrogen-related sides are usually very mild to non-existent with this compound by itself. If used by itself or stacked only with non-aromatizing agents, it can cause low-estrogen side effects, like lethargy, low libido, or a depressed mood. For this reason, it's a good idea to stack some testosterone with it. Males need estrogen to function optimally, and keeping at least a replacement dose of Testosterone in with the DHB cycle should accomplish this. The added testosterone will give the cycle another androgenic kick. The side effects aren't bad in comparison to stronger compounds like trenbolone, and are probably more in line with the side effects that can arise with Equipoise and/or Primobolan. As stated above, support supplements, blood work, and a full post cycle therapy (PCT) should be used at all times.
9.) 1-Testosterone is a dihydrotestosterone (DHT)-derived steroid. I have heard a few people say it's not a Testosterone base, DHT base, or 19-Nor base but something different. This is not correct. 1-Testosterone is to Boldenone (EQ) what DHT is to Testosterone. Boldenone is a Testosterone-derived steroid, and dihydroboldenone is the DHT-style derivative of Boldenone. This is also why it gives the classic DHT-type results: hardening, strength, little to no water retention, and low to no estrogenic side effects.
10.) 1-Testosterone can be an irritating substance to the skin. This has been known to cause not only some irritation with the transdermal formulas but also the PIP it is best known for. One other thing I found interesting was that some users may experience a slight burning during urination from this compound. I personally noticed I felt like I had to urinate more frequently while on 1-Test Cyp once I got over 500-600mg/wk; this is also where I noticed the slight stinging or burning sensation. It was very mild, and honestly, had I not been really paying attention to the small details, I might have overlooked it.
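As a side note, the volume math running through points 3 and 4 is easy to sanity-check. The sketch below is purely illustrative (the function name and figures are invented for the example, and none of this is dosing advice); it just splits a weekly amount at a given concentration into equal injection volumes:

```python
# Hypothetical helper: convert a weekly amount (mg) at a given
# concentration (mg/ml) into the volume (ml) of each of n injections.
def split_injections(weekly_mg, mg_per_ml, n_injections):
    total_ml = weekly_mg / mg_per_ml
    return round(total_ml / n_injections, 2)

# 400mg/wk of a 100mg/ml solution, as in point 4:
print(split_injections(400, 100, 2))  # two shots of 2.0 ml
print(split_injections(400, 100, 4))  # four shots of 1.0 ml
```

This is the same trade-off point 3 describes: the weekly total stays constant while each individual injection gets smaller.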
In closing, my personal opinion on 1-Testosterone is that it can be an excellent tool for bodybuilders who don't mind doing frequent, voluminous injections. It works great during a cut phase as well as a lean mass building phase. Overall, the side effects are relatively mild, but not to be taken for granted. The anabolic effects are significant if you can tolerate moderate or higher doses of this anabolic; 400mg+ is where you really start to see some changes. Do I think it compares to Tren? No, I don't, but I do feel it works better than both Primobolan and EQ mg for mg, though EQ is easier to use at higher doses. The PIP can be severe in some cases and varies from person to person and batch to batch. I hope you guys enjoyed this article. To close, I'm going to give you a few example setups. Here are a few hypothetical scenarios; this is an example of how I would set up different types of cycles with DHB, after having used 1-Test Cypionate quite a few times.
Example 1: 1-Test Cyp as part of a cutting stack would look something like this. For theoretical purposes, here is what a cut stack may look like while utilizing 1-Test Cyp for a male bodybuilder: 300mg of Test Prop per wk (100mg on Mon, Wed, Fri), 300-450mg of 1-Test Cyp (100-150mg on Mon, Wed, Fri), 300mg of Tren Ace per wk (100mg Mon, Wed, Fri), and perhaps a lower dose of oral Winstrol or Anavar (25-50mg) on training days only. Ancillaries: Exemestane 12.5mg on Mon, Wed, Fri.
Example 2: 1-Test Cyp as part of a bulking cycle would look something like this: 600mg of 1-Test Cyp (200mg Mon, Wed, Fri), 600mg of Test E or C (200mg Mon, Wed, Fri), 300mg of NPP (150mg Tues and Thurs), 25mg of Dianabol on training days.
GH: 2 IU pre-workout / 2 IU post-workout
Ancillaries: Exemestane 12.5mg a day.
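For anyone double-checking Example 2's numbers, the per-shot figures above can be summed into weekly totals with a quick script (an illustrative sketch that simply restates the schedule from the text; not advice):

```python
# Example 2's per-injection doses (mg), keyed by compound.
schedule = {
    "1-Test Cyp": [200, 200, 200],   # Mon, Wed, Fri
    "Test E or C": [200, 200, 200],  # Mon, Wed, Fri
    "NPP": [150, 150],               # Tues, Thurs
}

# Weekly total per compound.
weekly_totals = {name: sum(doses) for name, doses in schedule.items()}
print(weekly_totals)  # {'1-Test Cyp': 600, 'Test E or C': 600, 'NPP': 300}
```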
Llewellyn, W. (2017). Anabolics (11th ed.). Molecular Nutrition.
1-Testosterone (dihydroboldenone)
ghostofvixx · 5 years
First Disgrace- ML
Asbolo was a centaur that could predict if something good or bad was going to happen based on the flight of birds. Mark, as his descendant, lives a quiet life until, by chance, he meets Pandora’s box guardian.
Words: 4k
Genre: fluff
Warnings: Cringey at times?? + It's not exactly like Greek mythology, I've made up some things sorry ksks. Idk what this is tbh, but I felt like doing one of these projects and BAM Mark happened, so I hope you enjoy it uwu.
Mark is guided by signs, therefore, he’s not an Oracle. 
Being a descendant of Asbolo is sometimes fun, but people tend to confuse terms and end up coming to him to ask things about their future.
Like at this exact moment.
“Mark, can’t you try it even once?" Chenle pouts. "I'm tired of being lonely."
"You're an elf, can't you just focus on your job?"
"How can I focus on my job when I'm a lonely elf?"
"Just... Okay, forget it." Mark rolls his eyes and looks at his friend. "Look, I can't tell you anything about your future, I can't see it, you should ask the Oracle instead."
"Do you think the Oracle will listen to me?"
"I don't know how busy they are, but they can give you an answer at least."
"Okay, goodbye buddy, have a nice day." Chenle cheers up a little and leaves Mark's house. Mark sighs once again.
Why can't people understand he's not an Oracle?
Mark's power is very different, maybe not as powerful, but still very useful. If there's a white pigeon, he knows that something good is going to happen. However, if the bird he sees is a black crow, that means disgrace. Sometimes, by instinct, he knows what is going to happen and to whom. Other times, it's very difficult to crack the code and he can't tell exactly. Maybe, if he's feeling lucky, he knows what to do: whether he should follow the bird, whether he shouldn't because it may be dangerous, etc.
Asbolo was a centaur, one of the wisest ones. He could tell destiny just by studying a bird's flight, just like Mark does. How he died remains a mystery to this day, twenty years later. Right after that, and since the world needed to keep its stability, Mark was born as the one who would replace the great Asbolo.
In spite of being well known for his power and having guests every day, Mark considers his life boring. In all his life there hasn't been a single disgrace; he has seen some bad omens, but nothing serious. Why does he have that power if nothing is going to happen? All he does is smile when he sees a white pigeon and has a good feeling, and worry when he sees a black crow.
And, like that, his day goes on. He takes a walk to see his friends, he visits Chenle once again and cheers him up, they talk a little about Jisung and Haechan since they haven't seen them in a while, he gets a few things to cook and then comes back. His days are always this boring. When he gets to his small house and tries to cook, as always, he sees it.
A white pigeon.
He smiles, wondering who that white pigeon will make happy that day. Then, the pigeon flies away and Mark feels his curiosity building up.
He begins to walk faster and faster until his feet gain speed on their own, and he finds himself running after it. His little town gets lost behind him and suddenly he only sees a big field and mountains. He doesn't know where he is, but that doesn't stop him from following that white pigeon into the mountains, where the terrain gets more uneven and it's hard to orient yourself.
He doesn't feel tired in spite of running such a long way; in fact, he doesn't feel anything at all. He doesn't come to his senses until the white animal finally stops flying and lands in front of what Mark thinks is the entrance of a cave. He turns around to look at the small animal, as if it could have the answers, but it's no longer there, so he supposes that he has to go inside.
It’s just the way he thought it would be: dark, humid, very cold.
"I'm Asbolo's descendant, what am I doing here?" He whispers to himself, not expecting to hear his voice echo as loudly as if he had said it out loud. Suddenly, he hears some weird noises coming from deep inside the cave.
“Is anybody here?” He asks, knowing he probably wouldn’t get an answer.
But he does.
"What are you doing here?" A feminine voice asks rudely. The voice makes Mark open his eyes wide as he feels his hair standing on end. He doesn't answer, too scared to talk. "I asked you a question!" The voice says again, angrily.
“I- I got lost and somehow ended up h-here, I‘m s-so sor-ry.” He apologises when he finds his voice.
“Did Renjun guide you here?” The voice asks, this time more calmly.
“Wh-which Renjun? My Renjun?"
“I don't know who your Renjun is, he's my guardian angel.”
It's his friend then.
He thinks of the white pigeon that has brought him there. Angels have the ability to turn themselves into whatever they want to, could it be him?
"I-I don't really know, I just followed the pigeon and ended up here. May I know who I'm talking to?"
The voice sighs deeply.
"I guess Renjun sent you here, so why not. I'm the guardian of Pandora's box, the box that contains all the evil things that could deeply hurt both the humans' world and the gods' world." Mark then pictures an old woman, and that makes him feel more relaxed. He has been told many times what the guardian looks like. "Who are you?"
“I’m Mark, Asbolo’s descendant.”
“Nice to meet you, Mark.” The woman takes a few steps to the front and, in spite of the low light, he can see her perfectly.
She’s not the way he had been told.
A young, beautiful girl comes to meet him. She can't be older than him; he supposes she must be around his age.
“N-nice to meet you eh-”
“Y/n.”
“S-sure, y/n.” Mark is beyond embarrassed, he wasn’t expecting to find such a young person looking after something so important.
"I don't know if Renjun is the one who brought you here, Mark, but I guess you don't want to be here, nobody would. I suggest you go back to the place you live."
And if you're surprised to find out that someone has gotten into your cave for no apparent reason, Mark is even more surprised. He takes a quick bow, showing his respect to such an important person in the gods' world, and turns around to leave. He doesn't know how, but he will find his way back home and forget this whole mess ever happened.
The following day he is still confused. It is his first interesting experience ever, so he isn’t sure how to react.
"What does it really mean?" He whispers to himself. "Did Renjun really want me to look after her? But why would he?"
He knows the angel, he considers him a friend, but since he’s a guardian angel, he’s always in some type of mission or is too busy to hang out with his friends. He would have warned him about something like this, even if they barely talk.
He shakes his head and puts on his clothes, then walks down the stairs to go to the kitchen and have breakfast, exactly the same as yesterday. But then, when he looks through the window, he finds a white pigeon.
He ignores it; maybe that's the best thing to do, right? But when he doesn't even look at the white pigeon again, the animal gets angry for no reason and starts to peck Mark's shoulder, without hurting him.
"What?" He asks. "What are you doing here again?"
The animal ignores him as it continues to peck the boy's shoulder. Mark sighs once again.
"You're here to take me there, right?" He asks the pigeon. "If anyone comes here and finds me talking to an animal, I'm sure they'll think I'm crazy instead of wise." He knows he shouldn't listen to whoever is behind the animal's behaviour, but he does anyway. Only because it won't go away, and its constant attempts to catch his attention are getting more and more annoying.
So he finds himself running again, behind the small creature. This time, he knows he has memorized the way there.
"You're here again?" He hears when he comes into the deepest part of the cave and sees you sitting with your legs crossed right in front of a small box, looking at it.
"I don't know why I'm here either." He answers.
"Well, I guess I have no choice but to listen to Renjun." You sigh. "I told him I could handle this on my own, but apparently he thinks I'm too weak."
"Where is he?" Mark asks, feeling curious about his friend's location.
"He's acting as some demigod's guardian angel. Apparently that boy needs a miracle or else he would retire." You explain.
"Can you retire from being a demigod?"
"No, but you can always give up." You shrug. Mark doesn't think he's going to have a real conversation because you won't even dare to look away from the box.
"Do you have to be looking at that box all the time?"
"Are you aware of what would happen if I don't?" You snap.
"Well, I guess you have a point." He takes a seat right beside you.
"Don't you dare talk to me, stupid."
"There's no need to insult!" He defends himself. "Besides, what else can I do? I have to find out why Renjun sent me, we barely talk nowadays because he's always busy, now I know why though. Still, why would it be me?"
You stay silent, but don't stop the boy from rambling all day long about this and that, about his friends, about traditions and myths. You've been alone your entire life; only Renjun was there to help you and keep you company, and he's not the most cheerful of the angels, he loves to nag and tease you, but still, you love him. However, now that there's another person... You don't know how you feel, but it's somewhere between "can this boy just shut up" and "well, it doesn't feel as bad."
Neither of you realises how the pigeon's wings, as white as sugar, slowly turn black.
***
Days went by, Renjun was nowhere to be seen, but neither of you cared as much as before. Mark didn't even need the white pigeon anymore; he would just go to your side by instinct, he would talk to you about anything and wait for you to slowly tell him about yourself and your life.
He wasn't successful most of the time, but sometimes you would open up.
"So, how did you meet Renjun?"
"Is it really important, Mark?" You answer tiredly, knowing that the boy won't ever give up.
"Come on, you know my entire life, tell me something about you!" He cheers.
You sigh.
"I know your entire life because you won't stop talking." You tease. "But anyways, there's nothing interesting to tell. I've been locked here ever since I was born."
"And still you're such an interesting person, how do you do it?" He whispers out of the blue.
"What?" You pretend you haven't heard him.
"Nothing, I was just thinking random things." He blushes. "Back to the point, how did you meet Renjun?"
And maybe it's because nobody has really ever wanted to get to know you, and that makes you have a soft spot for that adorable blonde boy, or maybe because you finally have someone to talk to apart from Renjun and that makes you feel excited, but you end up explaining it to him anyways.
"I've always lived here, since I was born, so from a young age I've been aware of the importance of this box." You look at the small object in front of you, as you always do. "Since it's an important job and nobody wants me to be as irresponsible as Pandora was, I need supervision. A lot of different creatures offered, but Renjun was the only one who could fit what I really needed."
"Why?"
"He became my friend. In spite of nagging me a lot, he always told me about the world outside and warned me about everything; he takes good care of me even when he has to be someone else's guardian angel." You ramble.
"I have a theory." Mark comments.
"Huh?"
"You like him." You slap his arm strongly and he complains.
"Drop that theory, he's like my little brother." You scoff.
Because the truth is that ever since you learnt about love, you think you may like the boy you had always seen as a brother, but you don't want Mark to know.
"Don't worry, I understand it." He comments, but you don't add anything else. "He may be listening to us though."
"That's right, the pigeon may be him." You comment sarcastically. "Is it here though?"
"To be honest, it's been a few days since the last time I saw it. I've memorized my way here."
"That's brave, you're the only one who could."
"It's my first adventure, I have to." He smiles and puts a hand on your shoulder. You freeze.
Your heart feels warm by that gesture.
"O-okay."
Mark stands up and makes his way outside the cave when he stops.
"Look! It's here!"
"What?"
"The pigeon, it's here." He specifies. Suddenly, he frowns.
"What's wrong?"
"Its wings and part of its head are black."
"And?"
"It wasn't like that before, and I don't have a good feeling about this."
"Make your way to your house fast and safe, then." You warn him without thinking about it. You think you may have sounded creepy, but he smiles widely, and your heart stops.
"I will! See you tomorrow!"
Mark is dangerous.
No, actually, the way you keep forgetting Renjun and your heart jumps at the thought of Mark is.
***
“Then, if you see a white pigeon you always have a good feeling?” You ask him, making some small talk while both of you have your eyes fixed on the box, just like you've been doing these past months since your first encounter.
“Not always. Pigeons in general, but especially white pigeons, represent good things, but for me to have a good feeling about them, there have to be more factors than just the animal.”
“Such as?”
“The way it flies, the direction, whether it approaches any creature living in this world or stays away. Not all pigeons have good intentions, or intentions at all; that’s why it’s important to study them very well.”
You stay quiet, because you have no words. You’re fascinated by him; he’s nothing like the clumsy boy you once met. In your eyes, he has completely changed. But you know that even if you accepted that you have feelings for him, you wouldn’t confess, because he doesn’t like you back.
How do you know? Well, it’s easy. He has a life outside the cave. When Renjun comes back from wherever he is, Mark will return to his house and forget all this happened, he would probably end up forgetting the hermit he met in his first adventure as Asbolo’s descendant.
But you know your feelings will remain intact.
“It’s not fun if you don’t give me a reaction.” You chuckle at the boy’s comment.
“It’s so cool Mark, I wish I could do something like that. I admire you a lot.”
But you obviously don’t know how crazy Mark’s heart beats at your words too.
“Th-thank you y/n. I actually admire a lot what you do, it must be so hard.” 
Yes, it is, when I can’t give you what I want to give you.
“Anyways.” He continues. “Only the great Asbolo can do it the proper way, I’m still learning by interpreting what he left written in books, but I can give you some tricks to know the basics.”
“Okay.” You agree. No bird would ever get inside a cave, so you know you will never use it, but it still sounds interesting.
“You may not know what is going to happen and to whom, but if you see a pigeon then it will deliver a good message, as I’ve said before. However, if you see a crow, then it’s totally the opposite.”
“Bad news?”
“Yes. Crows usually mean death or bad signs, so if you see a crow, then a disgrace will happen, to anyone from this world or the humans' world. That's why there are so many pigeons as well as crows."
You nod in understanding.
“I don’t think I’ll ever use it, but thank you though.” Now that you’ve grown some confidence in him, you feel brave enough to ask about more things. “Have you contacted Renjun?”
“No.” He answers. “But maybe I should convince a pigeon to do so, we have a lot to talk about, don’t we?”
“Absolutely. He has some explaining to do.” You smile, but it fades away when another question crosses your mind. “Do you regret wasting your time here?”
“Absolutely not.” Mark says, his heart beating as fast as yours. "How can I regret being with you?"
***
Mark did as promised, he sent his friend Renjun a letter explaining the situation. 
From there, the first disgrace happened.
“Mark, I’m glad to hear that you are taking good care of y/n when I’m not there, but you should know I’m not the person who sent that pigeon, so I know nothing about all that. My adventure with this demigod called Jeno might end up soon, so please, wait for me.
Keep looking after y/n too, please.”
“But if Renjun is not the reason behind all this, what is?” He thought out loud. Then, all the suspicious things he had noticed (such as why a bird would guide him to Pandora’s box guardian, or why a white pigeon would have black stains) slowly started lining up.
And all he could think of was you being in danger. Not even the cursed box mattered when you could be in danger just by looking after it, so he took the way he had already memorised; his feet were once again moving on their own just like that day, but this time they knew exactly where to go.
On the other side, you began your morning just as usual. You didn’t usually sleep at night, and had never ever felt the need to; you were born with just one purpose, and it was to never lose sight of the box. But lately you had been feeling more tired and distracted, and you even caught yourself daydreaming about what your life would be like if you weren’t doing this. By his side, of course.
You were too busy on your own thoughts when you heard something that made you freeze in fear. A cawing. What was a crow doing in a cave if they didn't get inside caves?
"What if someone is in danger?" You question yourself. If this bird has approached you out of all the people in both worlds, then the only person that could be in danger is... him. "Mark's in danger!" You shouted, scared.
Then you did the most stupid thing you’ve ever done.
You ran away, not even caring about the box, or that your life was in danger, and everyone else’s, you just had to. If a crow had found your cave and Asbolo and his descendant say it means no good, then it can’t be good.
You had been running without knowing where to go for a few minutes when, by chance, you saw him. His eyes widened and he ran towards you. You did the same.
“Are you okay y/n?” His eyes scanned you, looking for a wound, as he held you tightly.
“I-I don’t k-know. I s-saw a crow in m-my c-cave and I thought you might be hurt and I was so scared." You tried to explain.
"I'm fine, don't worry, I'm here." His words made you feel finally at ease, you hugged him without thinking. It was nice to know he wasn't hurt.
He did it back, as tightly as he could.
You stayed like that for a while, until you tried to break the hug.
"Don't." He said. "Let me stay like this a little longer; you don't know for how long I've been wanting to do this."
And you didn't know.
Maybe he liked you back?
"I'm glad you're not hurt, seriously, I was on my way to see you." You decide to say to break the silence.
"You were going to check if I was alright? Were you that worried about me that you were going to look for me when you don't know where I live?"
"Stop teasing! You already know the answer." You broke the hug, successfully this time, pretending to be mad.
You just weren't expecting him to press his lips softly against yours in a sweet kiss. You didn't move until you were able to register what was happening. Then, you kissed him back, feeling his smile growing as you did.
Everything was going great, until something interrupted you. A cold, black fog spread through the field you were in, making it almost impossible to see what was right in front of you; black clouds appeared in the sky, blocking the sun. But the creepiest thing was what you could hear: all types of creatures, from your world and the humans' world, shouting in fear as the ugliest creatures to ever exist roared.
Then everything went quiet and returned to normal. But that silence was the worst; it meant that the biggest disaster had taken place.
"Pandora's box has been opened," you said, terrified.
Mark looked as terrified as you, he was unable to move.
"B-by who?"
"I don't know Mark, but I wasn't there to stop it from happening."
***
Everything happened so fast.
Pandora's box had been opened. In a matter of seconds everyone knew what had happened. The gods summoned you; you knew you were going to be killed when you stepped onto Olympus. In that moment, you could only think of Renjun and Mark and how much suffering you had caused them.
"You didn't do your job properly." Athena, the goddess that had always supported you knowing how hard your job was, the goddess that had always given you the best advice and had introduced you to Renjun, looked at you, disappointed.
"I know, I deserve to be punished." You bowed, giving up.
"You do deserve to be punished," she answered solemnly. "But we suspect that there's something behind this whole situation. A crow opened the box because it had been ordered to do so. We've also discovered that it had gone through a process of metamorphosis; at first it was a pigeon. It was probably created to distract you and someone else from your initial job."
You shivered.
"That's why all of us have decided that you will live until everything is fixed, since you may be helpful. For the moment, you are exiled, and you can't come back to the gods' world unless we need you."
Not seeing Mark and Renjun again, but being alive to suffer the pain, was just as bad as being dead. And only being able to say goodbye to one of them made it even harder.
"I trust you Y/n, but it's for the best." Athena said, then she turned around and left.
You found Mark when you left Olympus; he had already been told of your sentence.
"Renjun is nowhere to be seen." Mark read your mind.
"I hope he's doing fine. Tell him I say goodbye if you ever see him."
"I will." He remained quiet until you turned around and made your way to leave. "Wait, Y/n."
You stopped abruptly.
"Let me go with you." He suggested.
"Are you crazy? Mark this is not-"
"Look, I know." He interrupted you and held your hand. "This is a mess, this whole situation is, but I'm sure that you will eventually figure out what to do because you know that box better than anyone else. Let me go with you, we'll think together."
You were touched by his words, but you didn't want him to go through that.
"Mark you belong here, you can't just leave."
"I'll hide well, I promise!" You remained cold. "Look, you don't have to take the blame all by yourself, let me be with you, please."
You considered it.
But inside your head, everything that was happening could only be solved if you were with him.
"We will find a solution together, okay?" You answered and Mark kissed the top of your head. It was a difficult situation, but you knew that at least you could count on him.
53 notes · View notes
beardyallen · 6 years
Text
Bad news, guys...
Alright, so I’ve decided that, seeing as I’ll be visiting W-Town and the Great Wall again in May (when it will be waaaaay prettier), I’ll just do a post about it then.
Suffice it to say, it was a dope trip.
HOWEVER!!!!! I’ll tell y’all about my time since. The major highlight since the W-Town trip was obviously St. Patrick’s Day. I was somewhat nervous, given that most of the people I’ve met here probably wouldn’t want to celebrate the way that my family (which is way better at St. Patrick’s Day than your family, thank-you-very-much) celebrate.
There were no green alligators or long-necked geese, and that bleeding pub didn’t catch fire. Certainly not 12 times!! I suppose I still saw the same number of unicorns as usual, but I think I would have had bigger problems if there were more.
My plan for that day was to make it to Paddy O’Shea’s Pub, the Irish pub of Beijing, by 12pm on the 17th. As it turns out, the Pub had started their St. Patrick’s Day celebration on the 16th because they knew some people wouldn’t want to be completely hungover for work the next day.
Tumblr media
For me, though, 12pm on the 17th seemed a perfect time to start as it would be 12am on the morning of my grandmother’s would-be 91st birthday. I could go on and on about how wonderful that woman was, and how big of an impact she had on me, but I’ll just say this: she was a good Bud. I’m obviously incredibly thankful for this teaching opportunity, but I’m struggling with being okay about missing out on St. Patrick’s Day in Northern Michigan this year. At least I was there last year and for Christmas and the New Year. That will have to be enough.
Anyway, I went with ML, S and another neighbor L, none of whom have ever truly celebrated St. Patrick’s Day like an American, let alone a Sylvain, but they were open to trying. And I was the one leading the group, which I still think is strange as I thought I was the least capable of the 4 of us at guiding a group through this very Chinese city. Fortunately, that compass in my brain works just as well on this side of the world as it does state-side.
Oh, and I looked damn fine, if I do say so myself!
We were a tad late to Paddy O’Shea’s, but the beer came quickly enough, and it tasted almost as good as it would have at the Side Door Saloon.
Tumblr media
I didn’t take a picture of the bangers and mash that I ordered, but I couldn’t have been more pleased.
One major difference between celebrating here versus back in the States: there were people born and raised in Ireland celebrating with us! And there was a really cute bartender from just outside Dublin that came to serve beer just for that evening...
OH! On the Wednesday before, one of my students asked if I was going to wear a green hat when I celebrated, and the rest of the room laughed. I didn’t get the joke, commented that I’d for sure wear my green tie but that I didn’t own a green hat. After inquiring about the hat, they shared that, in China, wearing a green hat sort of sends the message that you’re a cuckold. 
I would later find out the “historical basis” for this strange cultural faux pas: during the Warring States Era in China, there was a famous political figure who was known to wear a green hat. Apparently he was a big deal, and he always wore a green hat. And then his wife cheated on him, so now a green hat means what it means. That’s it. That’s the whole story. It happened to one dude who happened to wear a green hat, and now it’s this huge thing that college students laugh about. *shrug*
Anyway, back to Paddy O’Shea’s. The bar itself was more “authentic” than I have grown to expect. I’ll probably pass the time in that pub a few more times before my time here is up. One of the key advantages is that it has a fully functioning website, which is something I’ve learned not to take for granted anymore. When I was searching in the days prior for a place to celebrate, I had stumbled across another bar: Molly Malone’s. Do not (I REPEAT: DO NOT) visit the website for Molly Malone’s. Especially at work. With the door open. When anybody and their mother could walk by.
The website, the one that the location on Google Maps and every other map app links you to, looks like a mid-’90s website with a few notable images. I’ll describe it for you to the best of my memory: the background is all black, all of the text is placed in little white rectangles, all of which span the middle 40% of the site and fit jigsaw-like to form one large rectangle of questionable links. The font itself is in a variety of cheap styles and bright, neon colors. Flashing text, color-changing text. The works. Again: it looked like a mid-’90s website. But not just any mid-’90s website.
A mid-’90s website with vulgar images that would make a 12 year old blush and fidget uncomfortably in their seat. I repeat again: do not visit this site! WHY IS THIS THE OFFICIAL SITE FOR THE BAR?
And when I found out that this bar is not only a real place that happens to be near a few foreign embassies, but that it is reportedly not-too-difficult to find a “lady of the night” in its vicinity, I wasn’t surprised. Why is it that those two pieces of information just “fit together?”
*sigh*
Paddy O’Shea’s, in contrast, is an upstanding establishment. And though they had started their party the day before and kept it going all night, the place was still in remarkably good shape, all things considered. Most of the seating was filled when we arrived, but by the time I left around 8pm (I’m completely guessing here; I have no idea what time it was), all of the standing room was occupied.
ML and S seemed quite gung-ho about having an Irish Car Bomb, while L was shocked that anyone would use such a phrase to describe a beverage. Unfortunately, ML had some grading to get back to, so they left before we ordered one, but not before some rando came by and spray-dyed my beard and S’s hair green.
Tumblr media
The dude in the middle isn’t the guy that did the coloring; just another “victim.”
Tumblr media
Not too long after, my officemate showed up; it was comforting to have someone there who had a decent grasp on the holiday!
Tumblr media
The non-Americans left soon thereafter, but AL and I kept ourselves sufficiently “entertained.” His friend P was also meeting us! She doesn’t think she’s ever really celebrated St. Patrick’s Day either, but she joined AL and me in our one and only Irish Car Bomb of the day. Kudos to her!
AL and I chatted the next day and confided that we were both a bit more pissed than we thought...
NR also came out to join us, but she didn’t arrive until after P was getting hungry. Although why she didn’t seem interested in bangers and mash, I have no idea. When AL and P left, P made sure to leave me with some chaperones, a group of ex-pats from several other countries whom P had joined for a shot of Fireball. For some reason, P was terrified at the idea of leaving me alone at a bar in Beijing. As if anything could go wrong?! I was with my people!
Anyway, I chatted up a nice girl from Texas, mostly about teaching because what else do I talk about nowadays, and NR finally showed. The good sport that she is, she joined me for another beer, and then we left to find food elsewhere. The place was getting to be a bit too much; she had just arrived, my voice was on its way out, and hers would have joined it not too long after.
As it turns out, there was a place just around the corner that specialized in Peking Duck, something that AL and I were both quite curious to try thanks to KFC’s interesting spin on it...
But again: my beard was green. And I wouldn’t say that I was loaded, but there were at least four rounds in my six-shooter, if you catch my meaning. And this restaurant was niiiiiiiiceeee!!! There were 4 different people who helped us before we got to our table: one took our reservation, another led us to the stairs, a third took us up the stairs, and a fourth led us the last 10 feet to our table.
Tumblr media
In hindsight, I was worried that it was just the way-too-many-beers-prior-to-entering-this-establishment that made watching this guy slice the duck so fascinating, but NR mentioned the following day that she found the experience just as captivating.
Tumblr media
Also, I’ve never been one for bathroom selfies...but when (drunk) in Rome (and by Rome, I mean a restaurant that I have no business being in), you do as Romans do. (Fun fact: Roman’s invented selfies. #themoreyouknow #notfakenews #youhearditherefirst)
Tumblr media
#romansdidntinventselfies #dontberidiculous #leavetheridiculousnesstomeandmygreenbeard
youtube
Seriously. This dude was awesome. I wish we had more footage...Guess you’ll just have to go there for yourself!
youtube
We also ordered several other dishes, all of which were amazing. Some shrimp, some part of a lamb, I think. All of it was good. Like everything else I’ve had in China!
All in all, the weekend was dope, the week after was less so, and the coming weekends will be amazing. My students had their first exam this week, and on Tuesday I ordered an American cheeseburger and a Budweiser from a western-style restaurant just to see if it holds up out here. It was...so-so. Last night, I joined a couple friends for a drink at a bar called “Lush;” apparently it was open-mic night. One of the guys I was with was hoping for an environment more conducive to idle chit-chat amongst the group, so we ended up leaving after only one. I was displeased, as I was having a great time. Guess I’ll just have to wander back out that way on my own sometime.
The plan for Sunday was to visit the Forbidden City, but I guess they ran out of tickets, so we’ll find something else to do. Will post after that. The weekend after is a Craft Beer festival that several of the faculty here will be visiting. I’m pumped.
OH! And I think I’ll be visiting Shanghai at the end of April! I didn’t know this, but apparently Shanghai was all grassland like 50 years ago! (This, according to one of the guys last night. Feel free to fact-check this.)
It’s going to be an interesting couple of weeks...
If only I could get my sleep schedule back on track. This whole “falling asleep at 4am and waking up at Noon” business is getting ridiculous. I blame my teaching schedule. #ishouldntcomplainbecauseimteachinginchina
Sláinte,
BeardyAllen
P.S. I’m super pumped for Shazam! And the End Game trailers are driving me up a wall...
1 note · View note
wallpaperpainter · 4 years
Text
The Seven Secrets You Will Never Know About M Graham Oil Paint | M Graham Oil Paint
FLORENCE, Italy (AP) – The Uffizi Galleries, the most-visited museum in Italy, is open again after three months of COVID-19 lockdown, delighting art lovers who don’t have to jostle with throngs of tourists thanks to new social distancing rules.
Uffizi director Eike Schmidt told The Associated Press on Wednesday that the government-ordered closure of museums during coronavirus control measures meant 1 million fewer visitors and 12 million euros ($13.2 million) less revenue for that period. Now, at most 450 people at a time are allowed in the Uffizi’s many galleries, packed full of some of the art world’s greatest masterpieces.
That means visitors no longer have to elbow their way in to enjoy such masterpieces as Botticelli’s “Birth of Venus.”
First in line to enter was Laura Ganino. She had been studying in Florence when the lockdown was declared in early March and was now finally about to leave the Tuscan city, as Italy on Wednesday eased restrictions on travel between regions in the country.
Schmidt said tourists from abroad weren’t expected to come to Italy in large numbers before 2021. Ganino took advantage of the smaller number of visitors. Crowds, she said, pose “an obstacle between me and what I’m observing.”
Right behind her in line was Patrizia Spagnese, from Prato in Tuscany. With crowds, “I get distracted, I tend to tire easily,” she said, so with her husband she was eager to savor the beauties inside the Uffizi, which she had never seen in its entirety despite having been in Florence many times.
Schmidt said social distancing heralds a new era in art experience. After being surrounded by
The Seven Secrets You Will Never Know About M Graham Oil Paint | M Graham Oil Paint – m graham oil paint | Welcome to my website; in this post I’m going to walk you through this keyword. And here, this is the first image:
Tumblr media
M | m graham oil paint
What about the image shown above? Isn’t it awesome? If you think so, I’ll show you several more photographs below:
So, if you want to save these incredible photos related to (The Seven Secrets You Will Never Know About M Graham Oil Paint | M Graham Oil Paint), click the save icon to store these pics to your PC. They’re ready to download; if you’d like to own one, click the save badge on the page, and it will be saved directly to your home computer. Finally, if you want to find unique and recent pictures related to (The Seven Secrets You Will Never Know About M Graham Oil Paint | M Graham Oil Paint), please follow us on Google Plus or bookmark this page; we try our best to offer you regular updates with fresh and new graphics. We hope you enjoy staying here. For all upgrades and the latest news about (The Seven Secrets You Will Never Know About M Graham Oil Paint | M Graham Oil Paint) pictures, please kindly follow us on Twitter, Path, Instagram and Google Plus, or bookmark this page; we try to present you with regular updates of all new and fresh images. Enjoy your browsing, and find the best for you.
Thanks for visiting our website. The content above, (The Seven Secrets You Will Never Know About M Graham Oil Paint | M Graham Oil Paint), was published recently. Nowadays we’re excited to declare we have discovered a very interesting niche to be reviewed, namely (The Seven Secrets You Will Never Know About M Graham Oil Paint | M Graham Oil Paint). Some people are searching for info about (The Seven Secrets You Will Never Know About M Graham Oil Paint | M Graham Oil Paint), and surely one of them is you, is it not?
Tumblr media
Save On Discount M Graham Co Artist Oil Paint, Azo Green .. | m graham oil paint
Tumblr media
Amazon.com: M. Graham Tube Oil Paint Basic Color 5-Color .. | m graham oil paint
Tumblr media
Café Man Ray (1968) – Man Ray (1890 – 1976) – m graham oil paint | m graham oil paint
Tumblr media
M Graham Artists Oil Paint Colors – M Graham Artists Paint .. | m graham oil paint
Tumblr media
Watercolour Painting – Indian Artist Anikartick,Chennai,Tamil Nadu,India – m graham oil paint | m graham oil paint
Tumblr media
Oil Paint Review and Tint Test :: M. Graham & Co | m graham oil paint
Tumblr media
sermon on the mount (NJ Cottingham, 1848) – m graham oil paint | m graham oil paint
Tumblr media
M. Graham Artist Oil Paint Titanium White 5oz Tube .. | m graham oil paint
Tumblr media
Christ carries his cross (NJ Cottingham, 1848) – m graham oil paint | m graham oil paint
Tumblr media
M. Graham : Artists’ Oil Paint : 37ml : Zinc White .. | m graham oil paint
Tumblr media
M. Graham Artist Oil Paint Yellow Ochre 1.25oz/37ml Tube .. | m graham oil paint
Tumblr media
23 best images about M. Graham on Pinterest | Color wheels .. | m graham oil paint
Tumblr media
Watercolour Painting – Indian Artist Anikartick,Chennai,Tamil Nadu,India – m graham oil paint | m graham oil paint
Tumblr media
M. Graham Oil Colours – Made with Walnut Oil – Jackson's .. | m graham oil paint
Tumblr media
M Graham & Co. Artists Oil Paint 1.25 oz/37 ml tubes .. | m graham oil paint
Tumblr media
M. Graham Oil Colours – Made with Walnut Oil – Jackson’s .. | m graham oil paint
Tumblr media
M. Graham Artist Oil Paint Ultramarine Blue 1.25oz/37ml .. | m graham oil paint
Tumblr media
memories of 1966 – m graham oil paint | m graham oil paint
Tumblr media
M. Graham Oil Colours – Made with Walnut Oil – Jackson’s .. | m graham oil paint
Tumblr media
M. Graham Artist Oil Paint Titanium White Rapid Dry 5oz .. | m graham oil paint
Tumblr media
M | m graham oil paint
Tumblr media
Watercolour Painting – Indian Artist Anikartick,Chennai,Tamil Nadu,India – m graham oil paint | m graham oil paint
The post The Seven Secrets You Will Never Know About M Graham Oil Paint | M Graham Oil Paint appeared first on Painter Legend.
Painter Legend https://www.painterlegend.com/wp-content/uploads/2020/06/m-graham-oil-paints-150ml-m-graham-oil-paint.jpg
0 notes
Link
How Much Does Homie Cost?
Does the thought of spending $$$ on real estate commissions when you’re buying or selling a home make you shiver? It downright scares us to think about all the money that has been made off of buyers and sellers and their investments (AKA their homes). At Homie, we feel so passionate about changing the way people buy and sell homes, we set out on a mission to simplify the process while saving home buyers and sellers thousands of dollars.
How much does it cost to use Homie?
Savings While Selling
The average Homie seller saves $10k compared to traditional agents.
Now that we’ve got your attention, let’s talk about how Homie saves home sellers a lot of money. Your home is YOUR investment.
Flat Fees, Are the Bee’s Knees
Homie charges sellers a flat fee to sell their home. That’s it. No hidden fees, no surprises.
What Does a Flat Fee Get Me?
Just because we don’t charge a high commission doesn’t mean we don’t provide a full-service real estate experience. With Homie you get:
A dedicated real estate agent
A Comparative Market Analysis (CMA) on your home
Pricing assistance
Professional photography
Listing on Homie.com
Listing on the MLS
Listing on major real estate websites
Negotiation assistance
We might emphasize the savings, but we don’t skimp on service. Our priority is to make real estate stress free. To really drive home our point, we went a step further. Everything you need to manage during the sale of your home can be found on our website or on our app. You can set your availability for tours, review offers, and anything else you need all in one place. Real estate + technology = a beautiful, beautiful partnership.
Savings While Buying
You can get up to $5k back when you buy with Homie.
Oh yeah. That’s right. As a Homie buyer, you save on Buyer’s Agent Commission (BAC) because our agents are full-time salaried employees who don’t depend on commission for their income.
What Else Do I Get?
We can give you up to $5k back when you buy with us, and what else?
As a Homie buyer, you get a team of dedicated agents who are determined to help you find the home you’ve been dreaming of. You also get access to our platform. On Homie.com or on our app you can browse homes, schedule tours, and put in offers. It’s 2020, tech is the best way to get anything done, including buying a home.
How Does Homie Get Paid?
When a homeowner decides to put their home up for sale, they usually outline a percentage of the sale they are willing to offer a buyer’s agent who helps someone purchase the home (that percentage is called the Buyer’s Agent Commission or the BAC). This is a negotiable percentage, but it typically hovers around 3%.
The buyer gets up to $5k of that would-be commission, and Homie keeps the rest.
Because Homie utilizes tech to speed up and simplify the home buying process, we can take a smaller percentage of the buyer’s agent commission than a traditional agent would.
Why Does Homie Do What They Do?
Why did Homie decide to make waves in the real estate industry? Because it needed it. Here’s a quick history lesson.
Back in the day, buyer’s agents used to work full time to find homes for their clients. After all, there was no internet. Agents scoured the streets for homes for sale, contacted agents and arranged tours; they did a lot of heavy lifting. In fact, they had to connect with other agents all the time to find out who was selling what, and when.
Listing agents networked with buyer’s agents in major cities to make sure all the homes they were selling had immediate buyers. They told their clients how to make their homes more desirable. Agents would create placards for the homes they have for sale and place them outside to gather attention.
Nowadays, that work has significantly reduced.
We’re not saying that agents don’t work hard. We’re saying that the job description of an agent has changed. That’s why Homie decided that the way the real estate industry works should change, too.
Right Here, Right Now
What are home buyers and sellers doing now that caused an agent’s job to change? For starters, homes are listed online, meaning that anyone with access to the interwebs can find homes that fit their criteria. Oftentimes, buyers find the homes they want to see by themselves and an agent helps them gain access to the home for a tour.
What buyers need now is an agent who:
Helps them gain access to homes for showings.
Understands contracts and legal documents.
Knows the local real estate market.
Assists in crafting the perfect offer.
Knows how to negotiate.
Walks them through the settlement and closing process.
In addition to needing different services from agents, buyers now want to save money. You’ve probably heard that, in the real estate world, buyers don’t have to pay their agent. While it may SEEM that way, that sentiment is not entirely true. A Buyer’s Agent Commission (BAC) is offered by the seller of a home. They can choose the percentage they’re willing to pay. Still sound confusing? We know. A lot of things in the world of real estate are confusing. That’s why one of Homie’s goals is to educate and advise. Here’s how it breaks down:
->Seller offers BAC.
->Seller adjusts (raises it) the price of their home to accommodate BAC.
->The buyer finances the home for the adjusted price to include the BAC.
Ahhh. So there it is. BAC is covered in the cost of the home, so what that means is: buyers usually end up financing the cost of the BAC into the cost of their mortgage. This trickle-down cost to buyers has become a concern to homebuyers in every stage, but even more so to first-time buyers.
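To make that arithmetic concrete, here's a minimal sketch of the flow above. The 3% BAC rate and $5k rebate cap are illustrative assumptions for this example, not anyone's actual terms:

```python
# Illustrative only: how a buyer's agent commission (BAC) folds into a
# home's list price, and what a capped-rebate model returns to the buyer.

def price_with_bac(base_price: float, bac_rate: float = 0.03) -> float:
    """Seller raises the list price to cover the BAC they offer."""
    return base_price * (1 + bac_rate)

def buyer_rebate(list_price: float, bac_rate: float = 0.03,
                 rebate_cap: float = 5_000) -> float:
    """Buyer gets the BAC back up to a flat cap; the brokerage keeps the rest."""
    bac = list_price * bac_rate
    return min(bac, rebate_cap)

if __name__ == "__main__":
    base = 300_000
    listed = price_with_bac(base)   # 309,000.0 — buyer finances the BAC too
    rebate = buyer_rebate(listed)   # BAC is 9,270, so the rebate caps at 5,000
    print(f"List price: ${listed:,.0f}, buyer rebate: ${rebate:,.0f}")
```

On a cheaper home the BAC can fall under the cap, in which case the buyer simply gets the whole commission back.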
Like everyone else with a smartphone, they want an easy-to-use app to browse homes, book tours, and put in offers. Because we are the people with smartphones that want ease of use, we created the app that solves this problem. Buyers can download the Homie app and look at all the homes that their hearts desire. Once they find a home they want to look at, they can book a tour directly from the app. The app’s uses go on and on, but you just need to know that it’s easy (and some would say rad) to use.
Now what do sellers want?
They want a listing agent who:
Helps them get professional pics of their home.
Lists their home on major real estate websites, including the MLS.
Markets their home to its full potential.
Assists in negotiating offers.
Walks them through the settlement and closing process.
The short story is, they want someone with industry knowledge to help them through the process of selling a home, but they don’t want to have to give up an arm and a leg to pay for it. That’s where Homie steps in. We charge sellers a flat fee to help them sell their home. You won’t find high commissions here. In addition to saving green, today’s sellers live in the world of credit cards and Amazon Prime; that means they want a quick, tech-enabled process. We’ve got a solution for that problem, too. Sellers can take advantage of our website, homie.com, or our app. With one login you can manage your listing, review offers, and sign paperwork. It’s no Candy Crush, but it’s almost as fun.
What’s the Bottom Line?
For sellers, Homie costs a flat fee.
For buyers, Homie will give up to $5k back of what would have been the buyer’s agent commission. Homie keeps what’s left after we give you yours.
Still Have Questions?
We want to give you an answer. Give us a ring by calling (385) 429-6888. You can also visit our buy and sell pages to learn more about each process.
The post How Much Does Homie Cost? appeared first on Homie Blog.
https://ift.tt/2um5RbO
0 notes
Text
Addicted to Real Estate - Why I Can't Stop and Why You Should Start
Then and Now
Ten years ago, a search for real estate would have started in the office of a local real estate agent or just by driving around town. At the agent's office, you'd spend an afternoon flipping through pages of active property listings from the local Multiple Listing Service (MLS). After choosing properties of interest, you'd spend many weeks touring each property until you found the right one. Finding market data to help you assess the asking price would take more time and a lot more driving, and you still might not be able to find all of the information you needed to get really comfortable with a fair market value.
Today, most real estate searches start on the Internet. A quick keyword search on Google by location will likely get you thousands of results. If you spot a property of interest on a real estate web site, you can typically view photos online and maybe even take a virtual tour. You can then check other Web sites, such as the local county assessor, to get an idea of the property's value, see what the current owner paid for the property, check the real estate taxes, get census data, school information, and even check out what shops are within walking distance-all without leaving your house!
While the resources on the Internet are convenient and helpful, using them properly can be a challenge because of the volume of information and the difficulty in verifying its accuracy. At the time of writing, a search of "Denver real estate" returned 2,670,000 Web sites. Even a neighborhood-specific search for real estate can easily return thousands of Web sites. With so many resources online, how does an investor effectively use them without getting bogged down or winding up with incomplete or bad information? Believe it or not, understanding how the business of real estate works offline makes it easier to understand online real estate information and strategies.
The Business of Real Estate
Real estate is typically bought and sold either through a licensed real estate agent or directly by the owner. The vast majority is bought and sold through real estate brokers. (We use "agent" and "broker" to refer to the same professional.) This is due to their real estate knowledge and experience and, at least historically, their exclusive access to a database of active properties for sale. Access to this database of property listings provided the most efficient way to search for properties.
The database of residential, land, and smaller income-producing properties (including some commercial properties) is commonly referred to as a multiple listing service (MLS). In most cases, only properties listed by member real estate agents can be added to an MLS. The primary purpose of an MLS is to enable the member real estate agents to make offers of compensation to other member agents if they find a buyer for a property.
Commercial property listings are also displayed online, but aggregated commercial property information is more elusive. Larger MLSs often operate a commercial information exchange (CIE). A CIE is similar to an MLS, but the brokers adding the listings to the database are not required to offer any specific type of compensation to the other members. Compensation is negotiated outside the CIE.
In most cases, for-sale-by-owner properties cannot be directly added to an MLS or CIE, which are typically maintained by REALTOR associations. The lack of a managed centralized database can make these properties harder to locate. Traditionally, these properties are found by driving around or looking for ads in the local newspaper's real estate listings. A more efficient way to locate for-sale-by-owner properties is to search for a for-sale-by-owner Web site in the geographic area.
One reason is that most of the 1 million or so REALTORS have Web sites, and most of those Web sites display varying amounts of the local MLS or CIE property information. Another reason is that there are many non-real estate agent Web sites that also offer real estate information, including for-sale-by-owner sites, foreclosure sites, regional and international listing sites, county assessor sites, and valuation and market information sites. The flood of real estate information onto the Internet definitely makes the information more accessible but also more confusing and subject to misunderstanding and misuse.
Changes in the technology behind the real estate business have caused many agents to change the way they do business. In large part, this is due to the instant access most consumers now have to property listings and other real estate information. In addition, the Internet and other technologies have automated much of the marketing and initial searching process for real estate. For example, consumers can view properties online and make inquiries via e-mail. Brokers can use automated programs to send listings to consumers that match their property criteria. So, some agents now limit the services they offer and change their fees accordingly. An agent might offer to advertise the property in the MLS but only provide limited additional services. In the future, some real estate agents may offer services in more of an a la carte fashion.
Some have argued that the Internet makes REALTORS and the MLS less relevant. We believe this will prove false in the long run. It may change the role of the agent, but it will make knowledgeable, qualified, and professional REALTORS more relevant than ever. In fact, the number of real estate agents has risen significantly in recent years. No wonder: the Internet has made local real estate a global business. Besides, Internet or not, the simple fact remains that the purchase of real property is the largest single purchase most people make in their lives (or, for many investors, the largest multiple purchases over a lifetime), and they want expert help. As for the MLS, it remains the most reliable source of real estate listing and sold information available and continues to enable efficient marketing of properties. So, what is the function of all the online real estate information?
Internet Strategies
In the sections that follow, we provide strategies and tips on how to use the Internet to locate properties for sale and research information relevant to your decision to purchase a property. There are many real estate websites from which to choose, and although we do not mean to endorse any particular website, we have found the ones mentioned here to be good resources in most cases or to be so popular that they warrant mention. One way to test a website's accuracy is to search for information about a property you already own.
0 notes
myecommerce-blog1 · 6 years
Text
How You Can Drive Sales with Facebook Messenger Chatbots
Artificial Intelligence (AI), neuro-linguistic programming (NLP) and Machine Learning (ML) all sound like visions of the future. Or at least something reserved for the elite programmers and smartest engineers at companies like Google, Amazon and SpaceX.
The reality is that chatbots are bringing these innovations to everyone. While these technologies are still young, they have tremendous potential to help you grow your business today – and they can be implemented with no prior knowledge of programming at all.
There are a lot of products nowadays that make life easy for e-commerce entrepreneurs. You’ve got companies like Shopify that make it easy to set up your store, and companies like Oberlo that make it easy to add products to your store.
Chatbots are another tool you can add to the mix to make it easier to get sales.
Chatbots have the potential to do to e-commerce what e-commerce has done to traditional retail.
Think about that.
The Future of Shopping
We may not be too far from a time when customers no longer shop at superstores like Amazon, but instead through personal, real conversations held privately.
Let’s take shopping for sneakers as an example. What’s the current buying experience like?
You go to a website you like – maybe you heard about it from a friend or found it through an intriguing Facebook promotion. Once you’re there, you find the shoe section, you filter by style, size and color, and then you’re presented with a list of options.
You compare a bunch of them and then you add a few to your cart so you can come back later. Maybe you will, maybe you won’t, although the research says that you probably won’t.
Now maybe you’ll get hit with a retargeted ad or see an advertisement on the way to work the following week and return to your abandoned cart to follow through with the purchase.
But what if you received a friendly reminder right on your phone? What if this reminder was personal, human even? What if this message came through the same channel that you use to catch up with your friends and family throughout your hectic life of e-mails and meetings, not to mention those telemarketers ignoring the fact that you’re on the Do Not Call list?
Do you think that might make it more compelling? Do you think your subconscious might connect better with that feeling of familiarity and individualism?
Now imagine that two weeks after your order arrives you get a friendly check-in from the customer service team:
“Hey Dave, how are you diggin’ your new kicks?”
You know they’re ready to help you with any issues, so you shoot back:
“They’re awesome! Thanks so much!”
And then you get an immediate response:
“Cool! As always, let me know if you need anything! If you’d like to leave a review about the Jordans you just bought, here’s the link: sneakerstore.com/review”
Now six months down the road (the usual time when you’d be looking for another pair of shoes, based on your order history), that chat comes back to life with a friendly check-in:
“Hey Dave, how’s it going? I just wanted to let you know about our big summer sale coming up!”
The chatbot becomes your personal open door to dialoguing with the company. 
They’ll send you tracking updates, process returns, and keep you up to date with the latest promotions, all customized to your taste.
If done properly, Facebook Chatbots provide enormous potential (a connection to 900 million people on FB Messenger) to drive sales and enhance the customer relationship. But these bots are new and foreign to most of us and aren’t superhuman (we’ll debate whether a computer can be superhuman later, okay?).
But if you aren’t careful, using a chatbot may backfire. You could annoy or confuse a customer, provide bad information or make it difficult to complete a purchase. A frustrated customer not only won’t buy, but they’re likely to suggest that others steer clear of your brand as well.
Back in 2015, messaging apps surpassed social apps in the number of monthly active users – and Facebook Messenger has 1.2 billion monthly users on its own. People around the world use it to easily keep in touch with friends and family – even those who tend to avoid that “social media stuff.”
Tumblr media
Let’s look at a few ways that e-commerce businesses are using chatbots to grow their business as well as some important tips for building the most effective bots you can. 
How to Use Facebook Messenger Bots
The best thing about chatbots is that they give you an automated, cost-effective way to communicate with your customers in a way that is more direct and personal than ever before.
It’s something you can easily add to your Facebook marketing strategy.
Here are some great ways to leverage the power of Facebook Messenger to connect with your customers:
1) Customer Service
Both social media and chat are becoming increasingly important for e-commerce customer service. One study found that you’ll lose as many as 15% of your customers if you ignore social media requests and that revenue per customer can grow 20-40% for businesses that do respond. Here’s what another study found:
44% of customers believe live chat is the most important feature a site can offer during a purchase. 
There are a few ways to use your chatbot to help with customer service. For starters, you can use it to field FAQs and provide simple answers. You can also use it to collect some basic information like e-mail, order number and description of the problem before passing it along to one of your support reps to take over.
It’s important to remember that this technology isn’t state of the art yet. You can’t expect your bot to handle any complicated or unique requests, so it’s essential that whenever you use bots you should have human support on standby to help answer any such questions.
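To make that concrete, here is a minimal sketch of an FAQ-fielding handler in Node.js. The questions, answers, and escalation prompt are all invented for illustration; a real bot would wire something like this into its Messenger webhook and hand the "escalate" case to a human rep:

```javascript
// Minimal FAQ matcher: answer common questions, escalate the rest.
// The FAQ entries and escalation message below are illustrative only.
var faqs = [
  { keywords: ['shipping', 'delivery'], answer: 'Orders ship within 2 business days.' },
  { keywords: ['return', 'refund'], answer: 'You can return any item within 30 days.' },
  { keywords: ['sizing', 'size'], answer: 'Check our size chart on the product page.' }
];

function handleMessage(text) {
  var lower = text.toLowerCase();
  for (var i = 0; i < faqs.length; i++) {
    var hit = faqs[i].keywords.some(function (kw) {
      return lower.indexOf(kw) !== -1;
    });
    if (hit) {
      return { type: 'answer', text: faqs[i].answer };
    }
  }
  // No match: gather the basics before handing off to a human rep.
  return {
    type: 'escalate',
    text: 'Let me get a teammate to help. What is your order number and email?'
  };
}

console.log(handleMessage('How long does shipping take?').text);
// Orders ship within 2 business days.
```

Keyword matching like this is deliberately dumb; the point is that even a simple first layer deflects repetitive questions while still collecting order number and email for the humans behind it.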
2) Order Confirmation and Updates
You can also offer your customers an opportunity to use Messenger to handle order and shipping confirmations. Send out tracking numbers, order updates and solicit feedback all from a single message thread.
One way to really boost sales is to make it easy to reorder via Messenger. If the bot has a customer’s order history, you can offer options to browse and reorder straight from their phone or browser.
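As a sketch of that reorder idea (the order records and payload naming are invented for this example), the bot could turn a customer's most recent orders into one-tap options:

```javascript
// Turn a customer's past orders into one-tap reorder options,
// newest first. The record fields here are hypothetical.
function reorderOptions(orders, limit) {
  return orders
    .slice() // copy so we don't mutate the caller's array
    .sort(function (a, b) { return b.orderedAt - a.orderedAt; })
    .slice(0, limit)
    .map(function (o) {
      return { title: 'Reorder ' + o.product, payload: 'REORDER_' + o.sku };
    });
}

var options = reorderOptions([
  { product: 'Air Jordan 1', sku: 'AJ1', orderedAt: 1714000000 },
  { product: 'Running socks', sku: 'RS9', orderedAt: 1713000000 }
], 3);

console.log(options[0].payload); // REORDER_AJ1
```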
3) Upsell and Cross-Sell
While improving the buying experience will certainly increase sales in the long run, you can also promote new products and offers directly to your customers via Messenger.
You already know their buying habits and their demographic information. You’ve established a line of communication. Now you can use this platform to remind customers of products they might want to check out or new promotions you have coming up which might interest them.
There is one caveat here.
Do not be too pushy or annoy customers. Unless they opt in for it, you shouldn’t be blasting them with the latest promotions every weekend – just the promos and offers that are catered specifically to their interests and purchase history.
These messages can feel very invasive to a customer, as they go directly to their personal phones and computers in a way that mimics the conversational experience of chatting with a friend. The occasional, useful promotion will be welcomed by most – but salesy spam will not.
Your bot serves at the pleasure of the customer. If you annoy them, not only will the communication line be severed, but it’ll leave a bad taste in the customers’ mouth, too.
4) Facilitate Sales
You can also try to facilitate your sales directly through chat.
However, keep in mind that these bots are still pretty basic. If you sell products like apparel with multiple options (size, color, style, etc.), it might be easier to direct your customers to browse your site in order to provide a better experience.
If not, you’ll definitely want to have a human on deck to help pitch in, as you can see from the example above. But if you sell a few straightforward products or wish to facilitate re-orders and add-on sales, this does provide an exciting opportunity.
Tips For Building Effective Chatbots
It’s important to keep in mind that no matter how cool and exciting these new bots are, they are still in the early stages and customers might not be used to interacting with them. Here are a few tips to help make the experience a good one.
1) Use Simple, Clear Language and Instructions
You can have some fun with greetings and witty jokes, but when it comes to the core functionality of your chatbot, make sure it’s easy to understand. The user should get a simple answer or solution and know what they need to do to move forward. Don’t leave them guessing as to what they need to say.
2) Use Guided Responses
You may have heard that NLP technology is helping computers understand complex sentences and formulate intelligent responses. While that’s true, your basic chatbot isn’t going to be equipped with the fanciest technology on the market.
You can solve this by using simple, clear language and prompting the user with options to respond. This maintains the flow and dynamic of the conversation without forcing customers to worry about formatting their answer properly.
Leaving questions open ended is likely to create a frustrating experience for shoppers. It forces them to guess what to say and increases the chance that they won’t have a coherent experience, thereby increasing the odds that they will leave before accomplishing their goal.
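For illustration, guided responses map naturally onto Messenger's quick replies. The prompt and options below are made up, and you should check the current Messenger Platform documentation for exact limits (for example, on title length), but the message body looks roughly like this:

```javascript
// Build a Messenger Send API message body with quick replies,
// so the shopper taps an option instead of free-typing a guess.
function buildQuickReplyMessage(prompt, options) {
  return {
    text: prompt,
    quick_replies: options.map(function (opt) {
      return {
        content_type: 'text',
        title: opt.title,    // what the user sees on the button
        payload: opt.payload // what your bot receives when it is tapped
      };
    })
  };
}

var message = buildQuickReplyMessage('What can I help you with?', [
  { title: 'Track my order', payload: 'TRACK_ORDER' },
  { title: 'Start a return', payload: 'START_RETURN' },
  { title: 'Talk to a human', payload: 'HUMAN_HANDOFF' }
]);

console.log(message.quick_replies.length); // 3
```

Because every tap comes back as a known payload, your bot never has to parse free text for these flows; it just switches on the payload.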
3) Don’t Be Pushy
I’ll say this again – your chatbots serve at the pleasure of your customer.
Before you can start chatting with anyone, they must approve and initiate contact. Users can also easily block your bot if it’s annoying them. Facebook does this to protect the platform from becoming the next generation of spam and maintain its status as the most popular chat app on earth.
It’s probably best to start by using your app to help improve your customers’ experience first by making it easier to manage returns, orders and customer service. Then slowly introduce promotions or selling opportunities. Monitor your customers’ reaction closely to make sure you’re not annoying anyone.
4) Have a Plan
Know who you’re building this bot for and what problem you’re trying to solve. Think about your customers right now and solicit their thoughts.
What part of the buying experience has the most friction? Where are they dropping off? What steps are costing you the most money or causing the most headaches?
Build a chatbot to solve those problems. One at a time.
5) Offer a Way to Speak with a Real Person
While it’s fine to get excited about this new technology, don’t expect to have robots fully servicing your customers like a Star Wars droid just yet (you’ve seen 2001: A Space Odyssey, right?).
Most chatbots are still programmed to respond to simple commands. And the last thing you want to do is leave your customer frustrated with an unresolved issue. Instead of making things easier and quicker for them, you’ll only infuriate them before they call you angrily, decide not to buy and/or leave a scathing review on all your social media.
This tip is especially important for Customer Service bots. Customers should always know that they can reach out to a real person for help. It’s a good idea to let them know how (like typing a certain command or calling your support line) in the beginning.
If you’re suggesting solutions to a problem, you may also want to offer a preset option like “Contact Support” in case their questions remain unanswered.
If you have the size and budget for it, it may even pay to have someone supervise the conversations in real time and jump in whenever things look messy.
Even though you need staff on standby, you’ll still benefit greatly from having bots handle the common and easy problems that would normally be repetitive and a waste of time. They can also do the legwork by collecting all the necessary information from your customer while she waits for your agent to connect.
6) Optimize and Update Your Bot
As with any new technology or marketing strategy, you’ll benefit from continued testing and improvement. Watch how your customers respond to new ideas and see which steps trip them up.
You won’t build the perfect system on your first try, but if you pay attention and aim to improve it each quarter, you’ll get more and more out of it. Watch technology trends and as this technology improves, you’ll be able to grow and do more with it. You’ll be lightyears ahead of the competition who wait until the next big press buzz to start.
How to Set Up Facebook Messenger Chatbots
There are many services out there that will help you set up Facebook Messenger Chatbots. In the beginning, you’ll probably want to use a platform like Chatfuel, OnSequel or Botsify. For a simple do-it-yourself guide to creating a chatbot, Social Media Examiner has a great post.
Tumblr media
Start by building a bot with a single goal like handling order confirmations or navigating your FAQ. Watch how your customers use it and refine it based on their feedback. Then slowly start adding features to help more customers and start actually driving new sales.
As a bonus, the early users will be excited to grow with your brand as they see their online shopping companion evolve based on their specific feedback.
While you could code these bots from scratch or hire a developer to build them for you (the same companies mentioned above offer custom development services), it’s really not necessary and probably a waste of money until you have a better idea of exactly how your customers are responding to it.
Conclusion
While we’re at least a few years (and it really might be just a few years) from building our own android assistant like C3PO, chatbots are already starting to help businesses mimic human conversations to enhance their relationships with customers.
That’s why Amazon and Apple both announced that they are focusing on machine learning technology in 2017 and why Shopify includes Messenger as an official sales channel.
Facebook Messenger chatbots are easy enough to set up on your own, so why wait to try it out?
Source: https://www.singlegrain.com/social-media-news/how-e-commerce-companies-can-drive-sales-with-facebook-messenger-chatbots/
0 notes
mbaljeetsingh · 6 years
Text
Build Custom Dashboards with MongoDB, Azure & Serverless Functions
This article was originally published on Ahmad Awais. Thank you for supporting the partners who make SitePoint possible.
TL;DR: I’m building a custom WordPress dashboard for an enterprise client which is powered by React.js on top of Node.js, with MongoDB Atlas as the database.
This dashboard uses several Microsoft Azure services, e.g., Cognitive Services, Azure App Services, and especially serverless Azure Functions. In this post, you’ll learn how to build a small module from it and the reason behind my choice of stack, apps, and products.
One of my enterprise clients who owns a huge networking and media company has a large-scale WordPress site set up. He recently consulted me about the possibility of building a custom WordPress dashboard (based on the WordPress REST API) — to help him make intelligent business decisions via Machine Learning and Artificial Intelligence.
With JavaScript eating up the world and WordPress adapting to the move by creating the Gutenberg project, I thought of an architecture/stack where WordPress would be our content layer, a familiar battle-tested environment that does its job well with a custom dashboard that’s built with JavaScript.
When you’re tasked to build a modern JavaScript application, you find yourself in a mix of different frameworks, tools, and dev workflows. The JavaScript ecosystem has grown a lot over the last couple of years. We have many, many good options available today.
So, after researching my options for a bit, I opted to use React.js on top of Node.js to start building the custom WordPress dashboard. While the project is in its ideation phase at the moment, I think it’s important that I share some of our goals here to define context behind my choice of the stack.
Custom WordPress Dashboard Goals
Imagine you own a large networking company where over 500 hotels (in three different countries) use your services to power their conference halls, IT meetings, and online property management like the sites and blogs. That’s what my client does.
Most of this is powered by a huge multi-site WordPress instance that manages everything for the hotels, websites, online booking, registrations, events, tickets, reviews, and comments. There’re also other systems running different software which are able to produce content via REST API.
We’ve set out to create a custom WordPress dashboard with many goals in mind, but I’m listing a few of them which are related to this particular article. Take a look at what I have built so far, it’s all based on serverless Azure functions — which are pretty awesome.
👀 High-level Data Reporting
The custom dashboard will report all the high-level data, e.g. things like live sales happening throughout my client’s portfolio (500+ hotels), entity/time-based and date-based breakdowns.
It also shows how each of his franchises is performing on a daily, weekly, and monthly basis. All of this data is being fed to MongoDB Atlas. More on that later.
⚡ Serverless Automation
Most of the modules are built upon serverless architecture — which in this case provides huge benefits. All the automation is always running and the cost is paid as you go i.e. pay for what you use.
An initial rough estimate puts this solution 34% more economical than having a server VM running all the time. We are using Azure Functions for this serverless automation.
🔥 IoT (Internet of Things) Hub
There are about ~200 IT managers working for my client who have IoT enabled devices that feed data into several online projects. This custom dashboard also includes that data for making better decisions and connecting the whole registration, management, maintenance team’s hub into a single place.
As you might have already guessed, this project makes use of IoT Hub from Microsoft Azure to connect, monitor, and manage all of the IoT assets.
🤖 Machine Learning and Artificial Intelligence
We’re using a lot of different services from Microsoft Azure for the sake of making this dashboard artificially intelligent by Machine Learning.
There’s a huge dataset that is fed to the ML Studio which later helps us predict different decisions like space management, low registrations trends for IT events, and questions like why and when these things happen.
While the Machine Learning part is beyond the scope of this article, I still plan to touch base with some of the awesome Artificial Intelligence I’ve been able to cook in via Azure’s Cognitive Services.
🕰 Live & Real-time
One of the most important aspects of this custom dashboard is that it’s live and real-time. Which means I need a managed database that can cope with this amount of data and still stay highly available.
But at the same time, it’s for the management purposes and doesn’t need to have any impact on the WordPress sites. That is a crucial system design decision for this dashboard.
By that what I mean is we can do all sorts of experiments with this custom dashboard but it shouldn’t have any impact on the database/servers which are running the multi-site WordPress instance.
MongoDB & MongoDB Atlas
For this custom WordPress dashboard, I am using MongoDB Atlas as a DBaaS (Database as a Service). And I couldn’t be happier. When I first shared that I’d be using MongoDB, many developers had concerns.
Most of the questions asked why I'd add another layer of complexity by adding yet another database to the mix. Why not use the WordPress database as it is? To answer these questions and more I have prepared a list of reasons as to why I am using MongoDB Atlas.
♨ Dislike for RDBMS
I personally dislike relational databases. Most of the time, for me, they get in the way of building applications. I have to step completely outside the app I am building, think about my database's future needs, and design a good schema, which always ends up being a bad exercise for my dev workflow. It’s counter-intuitive at best — at least for me, it is.
💸 HDD Is Cheap — CPU/RAM Is Not
Old databases were mostly designed to save disk space, among other things. This led to a plethora of problems like normalization, indexing, and made sharding, auto-scaling, and replication harder.
Nowadays, disk space is dirt cheap. On the other hand, CPU/RAM is not, and your sysadmin costs can skyrocket very quickly if you end up with a bad choice here.
Say you wanted to create a custom dashboard, but your system architect's design choices ended up costing you two sysadmins. Similarly, my client wanted a managed solution without having to hire a team of IT/DevOps folks, at least for an experimental custom dashboard.
🍀 MongoDB’s Pros
Schema-less. Flexible schema for the win. I don’t have to change anything in my regular app development workflow: I’m creating a Node.js-based app that manipulates JSON-type data, and I can just feed that into MongoDB and it just works.
Workflow-consistency. Creates documents the way my custom dashboard is represented. Sales, Videos, Talks, Comments, Reviews, Registrations, etc. all of that have similar data representation on the frontend and the backend — and even in the database. I manage 3rd party data via middleware. This consistency translates to clean code.
Ease of scale-out. It scales reads by using replica sets. Scales writes by using sharding (auto-balancing). Just fire up another machine and away you go. Most importantly, instead of vertical scaling via RDBMS, MongoDB lets you scale horizontally with different levels of consistency. That’s a big plus. ➕
Cost. Depends on which RDBMS of course, but MongoDB is free and can run on Linux, ideal for running on cheaper commodity kits.
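A tiny sketch of what that schema-less workflow buys you (the collection name, field names, and driver usage here are hypothetical, not taken from the actual dashboard): the same JSON the app already handles can be stored as-is, with no migration when new fields show up:

```javascript
// Shape an incoming sales event for storage. There is no schema to
// migrate if tomorrow's events carry extra fields; they are stored as-is.
function toSalesDoc(event) {
  return Object.assign({}, event, { receivedAt: new Date().toISOString() });
}

// Hypothetical usage with the official 'mongodb' driver:
//
//   var MongoClient = require('mongodb').MongoClient;
//   MongoClient.connect(process.env.ATLAS_URI, function (err, client) {
//     if (err) throw err;
//     client.db('dashboard').collection('sales')
//       .insertOne(toSalesDoc({ hotel: 'Hotel A', amount: 129.99 }));
//   });

var doc = toSalesDoc({ hotel: 'Hotel A', amount: 129.99, currency: 'USD' });
console.log(doc.hotel); // Hotel A
```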
🍃 Why MongoDB Atlas?
Well, now that I know MongoDB is the right database choice, there are so many different options to host your database. I can self-host on my Linux machine via DigitalOcean, use a cloud provider like AWS/Azure or a choose a DBaaS service specific to MongoDB.
But I want a fast, secure, and managed MongoDB solution that I can easily scale with the growth of the number of modules we attach in this custom WordPress dashboard. That’s MongoDB Atlas.
MongoDB Atlas is a cloud-hosted MongoDB service engineered and run by the same team that builds the database. And guess what, I trust that they follow the best operational practices since they are the ones who’re building MongoDB in the first place.
I want this custom dashboard to be self-managed, serverless, and using MongoDB Atlas saves me from worrying about software patching, backups, and reliable configuration setup for new DB updates. Again a big plus. ➕
Also, the fact that MongoDB Atlas is supported cross-platform as well as cross-region and across different cloud providers makes it a much better choice. I think each Cluster comes with two replica sets, ready to scale.
🔋 MongoDB Compass
Now that we are going to work with MongoDB, it’d be great to have a tool through which we can explore our database, view the changes, debug and so on. For this purpose, MongoDB again takes the lead with a product called MongoDB Compass. Take a look.
I suggest that you go ahead and download MongoDB Compass. It’s literally the best tool to visualize your MongoDB database. Here’s a set of features:
Visualize and explore: Take a look at your database, find out how things are looking, and even visualize stuff like maps/coordinates.
Insert, modify, and delete: You can also perform CRUD operations for your DB right from MongoDB compass. Makes testing easier.
Debug and optimize: Finally, analyze your data, debug it and even find out about performance issues right inside a great GUI for your database. This tool is a must-have if you work with MongoDB.
Extensible: And the best part is you can build your own plugins to extend MongoDB Compass. Here’s the documentation on building your own Compass plugins.
Enterprise Flavor: MongoDB Compass comes in a few flavors: Community (Free), and Enterprise (Licensed) — the Enterprise version is the one that lets you visualize DB schema.
✅ Getting Started with MongoDB Atlas
Let’s get started and build a simple module that's part of the custom WordPress dashboard I am building. For this module, we are collecting all the sales related data. For that, we need a MongoDB instance, and of course we’re using MongoDB Atlas here.
Step #1: Go to MongoDB Atlas →
Go to the MongoDB Atlas site and register a completely free MongoDB instance hosted on AWS, with shared RAM and 512 MB storage. Click the Get Started Free button.
Step #2: Sign up at MongoDB Atlas →
Now go ahead and sign up with your email ID and fill up the details. It’s amazing that you can sign up and use a free MongoDB Atlas hosted DB instance, and they don’t even require you to add a credit card for that.
Step #3: Create the Cluster
Now you’ll be redirected to a page with a bunch of information about the new MongoDB Cluster you’re about to create. I suggest that you review this information, and move ahead by clicking the Create Cluster button at the bottom just like in the screenshot below.
Step #4: Create DB Username & Password
It’ll take a minute and your DB will be created. Once that happens, head over to the Security > MongoDB Users and click on the + ADD NEW USER button on the right, to create a new user for your database. Let’s keep all the other settings set to default for the sake of this intro-article.
I’m setting the user/pass as usermongo but you know better.
Step #5: Add IP to Whitelist for Access
To be able to access your MongoDB Atlas database, you need to set up the IP Whitelist with the IP of the server where your app is hosted. Authentication is beyond the scope of this article, so for the purpose of this demo let’s just allow everyone (obviously a bad practice in production).
So, again, head over to the Security > IP Whitelist and click on the + ADD IP ADDRESS button on the right, and finally ALLOW ACCESS FROM ANYWHERE button to allow the anonymous access.
Step #6: Connect via MongoDB Compass
Now that our DB’s IP access and a user has been created, we can pick up the connection string and use it to connect to our database with our MongoDB Compass application.
Go to Connect then choose Connect with MongoDB Compass and download Compass if you haven’t. Copy the URI Connection String. Finally, open Compass and it should be able to detect the connection string in your clipboard, allow it to connect to your database.
And you are set to visualize your database, analyze its performance, and even run complete CRUD operations. Awesome! 💯
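As a rough sketch of what that connection string looks like from Node.js (the host, user, and password below are placeholders, not real Atlas values), you can assemble the SRV URI from its parts, remembering to URL-encode the credentials:

```javascript
// Build an Atlas SRV connection string from its parts.
// All values here are placeholders for illustration only.
function atlasUri(user, password, host, db) {
  return 'mongodb+srv://' + encodeURIComponent(user) + ':' +
    encodeURIComponent(password) + '@' + host + '/' + db +
    '?retryWrites=true';
}

var uri = atlasUri('usermongo', 's3cret/pass', 'cluster0.example.mongodb.net', 'dashboard');
console.log(uri);
// mongodb+srv://usermongo:s3cret%2Fpass@cluster0.example.mongodb.net/dashboard?retryWrites=true

// Hypothetical usage with the official 'mongodb' driver:
//   require('mongodb').MongoClient.connect(uri, function (err, client) { /* ... */ });
```

Encoding the password matters because characters like `/` or `@` in raw credentials would otherwise break URI parsing.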
Now that we have created a MongoDB Atlas and connected it with MongoDB Compass, we can move forward and start building our Node.js application.
WordPress REST API — FTW!
This WordPress based Node.js custom dashboard interacts with the WordPress instance via the WordPress REST API. Since this is a Node.js app, I am using an awesome library called wpapi written by K Adam White. He has also built a demo Express based WordPress app. That’s what I got inspired by while building this custom dashboard, so you’ll see a lot of it here.
🚀 WordPress Custom Router Based on Express
The router is set up with Express. Here’s a basic error handler and router template for using WordPress with express.
'use strict';

var express = require('express');
var router = express.Router();
var siteInfoMiddleware = require('../middleware/site-info');

// Set global site info on all routes
router.use(siteInfoMiddleware);

// Public Routes
// =============
router.get('/', require('./index'));
router.get('/page/:page', require('./index'));
router.get('/:slug', require('./single'));
router.use('/tags/:tag', require('./tag'));
router.use('/categories/:category', require('./category'));

// Catch 404 and forward to error handler.
router.use(function (req, res, next) {
  var err = new Error('Not Found');
  err.status = 404;
  next(err);
});

// Error Handling
// ==============

// Development error handler will print stacktrace.
function developmentErrorRoute(err, req, res, next) {
  res.status(err.status || 500);
  res.render('error', {
    message: err.message,
    error: err
  });
}

// Production error handler. No stacktraces leaked to user.
function friendlyErrorRoute(err, req, res, next) {
  res.status(err.status || 500);
  res.render('error', {
    message: err.message,
    error: {}
  });
}

// Configure error-handling behavior.
// 'env' is an app-level Express setting; in a router module,
// read NODE_ENV directly instead.
if ((process.env.NODE_ENV || 'development') === 'development') {
  router.use(developmentErrorRoute);
} else {
  router.use(friendlyErrorRoute);
}

module.exports = router;
🎚 Basic Express Based Implementation
I am not hosting this entire thing on WordPress, but the initial plan was to do just that. If you want to go do that, you’d wanna build the index by querying all the info using the RSVP.hash utility for convenience and parallelism. For that here’s what you should do.
'use strict';

var wp = require( '../services/wp' );
var contentService = require( '../services/content-service' );
var pageNumbers = require( '../services/page-numbers' );
var pageTitle = require( '../services/page-title' );
var RSVP = require( 'rsvp' );

function getHomepage( req, res, next ) {
  var pages = pageNumbers( req.params.page );

  RSVP.hash({
    archiveBase: '',
    pages: pages,
    title: pageTitle(),
    // Primary page content
    posts: wp.posts().page( pages.current ),
    sidebar: contentService.getSidebarContent()
  }).then(function( context ) {
    if ( req.params.page && ! context.posts.length ) {
      // Invalid pagination: 404
      return next();
    }
    res.render( 'index', context );
  }).catch( next );
}

module.exports = getHomepage;
🦏 Authentication Cooked In
For this setup, you’ll also need to authenticate your Node.js app by giving it the authentication data, which, along with wpapi, can be processed like this. Beware: this is not a best practice unless you use correct permissions and environment-variable settings.
var WP = require( 'wordpress-rest-api' );
var _ = require( 'lodash' );

var config = _.pick( require( './config' ).wordpress, [
  // Whitelist valid config keys
  'username',
  'password',
  'endpoint'
]);

var wp = new WP( config );

module.exports = wp;
🦁 Site Content Accumulation
And finally, you are able to consume all the content by creating a content service which handles recursively fetching:
All the pages of a paged collection.
Your WordPress site’s info.
An alphabetized list of categories.
A specific category (specified by slug) from the content cache.
An alphabetized list of tags.
A specific tag (specified by slug) from the content cache.
Other content required to have some feature parity with WP.
The code for this looks somewhat like this.
'use strict';

var wp = require( './wp' );
var cache = require( './content-cache' );
var _ = require( 'lodash' );
var RSVP = require( 'rsvp' );

/**
 * Recursively fetch all pages of a paged collection
 *
 * @param {Promise} request A promise to a WP API request's response
 * @returns {Array} A promise to an array of all matching records
 */
function all( request ) {
  return request.then(function( response ) {
    if ( ! response._paging || ! response._paging.next ) {
      return response;
    }
    // Request the next page and return both responses as one collection
    return RSVP.all([
      response,
      all( response._paging.next )
    ]).then(function( responses ) {
      return _.flatten( responses );
    });
  });
}

function siteInfo( prop ) {
  var siteInfoPromise = cache.get( 'site-info' );

  if ( ! siteInfoPromise ) {
    // Instantiate, request and cache the promise
    siteInfoPromise = wp.root( '/' ).then(function( info ) {
      return info;
    });
    cache.set( 'site-info', siteInfoPromise );
  }

  // Return the requested property
  return siteInfoPromise.then(function( info ) {
    return prop ? info[ prop ] : info;
  });
}

/**
 * Get an alphabetized list of categories
 *
 * All archive routes display a sorted list of categories in their sidebar.
 * We generate that list here to ensure the sorting logic isn't duplicated
 * across routes.
 *
 * @method sortedCategories
 * @return {Array} An array of category objects
 */
function sortedCategories() {
  return all( wp.categories() ).then(function( categories ) {
    return _.chain( categories )
      .sortBy( 'slug' )
      .value();
  });
}

function sortedCategoriesCached() {
  var categoriesPromise = cache.get( 'sorted-categories' );

  if ( ! categoriesPromise ) {
    categoriesPromise = sortedCategories();
    cache.set( 'sorted-categories', categoriesPromise );
  }

  return categoriesPromise;
}

/**
 * Get a specific category (specified by slug) from the content cache
 *
 * The WP API doesn't currently support filtering taxonomy term collections,
 * so we have to request all categories and filter them down if we want to
 * get an individual term.
 *
 * To make this request more efficient, it uses sortedCategoriesCached.
 *
 * @method categoryCached
 * @param {String} slug The slug of a category
 * @return {Promise} A promise to the category with the provided slug
 */
function categoryCached( slug ) {
  return sortedCategoriesCached().then(function( categories ) {
    return _.findWhere( categories, {
      slug: slug
    });
  });
}

/**
 * Get a specific tag (specified by slug) from the content cache
 *
 * The WP API doesn't currently support filtering taxonomy term collections,
 * so we have to request all tags and filter them down if we want to get an
 * individual term.
 *
 * To make this request more efficient, it uses the cached sortedTags promise.
 *
 * @method tagCached
 * @param {String} slug The slug of a tag
 * @return {Promise} A promise to the tag with the provided slug
 */
function tagCached( slug ) {
  return sortedTagsCached().then(function( tags ) {
    return _.findWhere( tags, {
      slug: slug
    });
  });
}

/**
 * Get an alphabetized list of tags
 *
 * @method sortedTags
 * @return {Array} An array of tag objects
 */
function sortedTags() {
  return all( wp.tags() ).then(function( tags ) {
    return _.chain( tags )
      .sortBy( 'slug' )
      .value();
  });
}

function sortedTagsCached() {
  var tagsPromise = cache.get( 'sorted-tags' );

  if ( ! tagsPromise ) {
    tagsPromise = sortedTags();
    cache.set( 'sorted-tags', tagsPromise );
  }

  return tagsPromise;
}

function getSidebarContent() {
  return RSVP.hash({
    categories: sortedCategoriesCached(),
    tags: sortedTagsCached()
  });
}

module.exports = {
  // Recursively page through a collection to retrieve all matching items
  all: all,
  // Get (and cache) the top-level information about a site, returning the
  // value corresponding to the provided key
  siteInfo: siteInfo,
  sortedCategories: sortedCategories,
  sortedCategoriesCached: sortedCategoriesCached,
  categoryCached: categoryCached,
  tagCached: tagCached,
  sortedTags: sortedTags,
  sortedTagsCached: sortedTagsCached,
  getSidebarContent: getSidebarContent
};
🛠 Custom Routes & Sales Data
Finally, I have cooked in quite a few custom routes from where I can attain any kind of sales related data. For the particular architecture I have in place, I’m again using the RSVP.hash utility for convenience and parallelism. It works like a charm.
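The exact endpoints are specific to the client’s portfolio, so here is only a rough, hypothetical sketch of such a sales route. The data services and their response shapes are invented for illustration, and RSVP.hash is approximated with a tiny Promise.all-based helper so the snippet runs without any dependencies:

```javascript
'use strict';

// Hypothetical data services standing in for the real MongoDB-backed queries.
// The names and response shapes are invented for illustration only.
function getLiveSales() {
  return Promise.resolve([ { hotel: 'hotel-42', amount: 1200 } ]);
}

function getDailyBreakdown() {
  return Promise.resolve({ 'hotel-42': { today: 3 } });
}

// RSVP.hash resolves an object of promises into an object of values;
// Promise.all plus a reduce gives the same behavior with no dependency.
function hash( promises ) {
  var keys = Object.keys( promises );
  return Promise.all( keys.map(function( key ) {
    return promises[ key ];
  }) ).then(function( values ) {
    return keys.reduce(function( context, key, i ) {
      context[ key ] = values[ i ];
      return context;
    }, {});
  });
}

// Express-style route handler: every dataset is fetched in parallel and
// the view renders once the whole hash settles.
function getSalesDashboard( req, res, next ) {
  return hash({
    liveSales: getLiveSales(),
    daily: getDailyBreakdown()
  }).then(function( context ) {
    if ( res ) {
      res.render( 'sales', context );
    }
    return context;
  }).catch( next );
}

module.exports = getSalesDashboard;
```

Mounted with something like router.get('/sales', require('./sales')), it slots right into the router shown earlier.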
The post Build Custom Dashboards with MongoDB, Azure & Serverless Functions appeared first on SitePoint.
via SitePoint https://ift.tt/2ubEcGK
t-baba · 6 years
Build Custom Dashboards with MongoDB, Azure & Serverless Functions
This article was originally published on Ahmad Awais. Thank you for supporting the partners who make SitePoint possible.
TL;DR: I’m building a custom WordPress dashboard for an enterprise client which is powered by React.js on top of Node.js, with MongoDB Atlas as the database.
This dashboard uses several Microsoft Azure services, e.g., Cognitive Services, Azure App Services, and especially serverless Azure Functions. In this post, you’ll learn how to build a small module from it and the reason behind my choice of stack, apps, and products.
One of my enterprise clients who owns a huge networking and media company has a large-scale WordPress site set up. He recently consulted me about the possibility of building a custom WordPress dashboard (based on the WordPress REST API) — to help him make intelligent business decisions via Machine Learning and Artificial Intelligence.
With JavaScript eating up the world and WordPress adapting to the move by creating the Gutenberg project, I thought of an architecture/stack where WordPress would be our content layer, a familiar battle-tested environment that does its job well with a custom dashboard that’s built with JavaScript.
When you’re tasked to build a modern JavaScript application, you find yourself in a mix of different frameworks, tools, and dev workflows. The JavaScript ecosystem has grown a lot over the last couple of years. We have many, many good options available today.
So, after researching my options for a bit, I opted to use React.js on top of Node.js to start building the custom WordPress dashboard. While the project is in its ideation phase at the moment, I think it’s important that I share some of our goals here to define context behind my choice of the stack.
Custom WordPress Dashboard Goals
Imagine you own a large networking company where over 500 hotels (in three different countries) use your services to power their conference halls, IT meetings, and online property management like the sites and blogs. That’s what my client does.
Most of this is powered by a huge multi-site WordPress instance that manages everything for the hotels, websites, online booking, registrations, events, tickets, reviews, and comments. There’re also other systems running different software which are able to produce content via REST API.
We’ve set out to create a custom WordPress dashboard with many goals in mind, but I’m listing a few of them which are related to this particular article. Take a look at what I have built so far, it’s all based on serverless Azure functions — which are pretty awesome.
👀 High-level Data Reporting
The custom dashboard will report all the high-level data, e.g. things like live sales happening throughout my client’s portfolio (500+ hotels), entity/time-based and date-based breakdowns.
And how each of his franchises is performing on a daily, weekly, and monthly basis. All of this data is being fed to MongoDB Atlas. More on that later.
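To make that concrete, here is a rough sketch of the kind of sales records the dashboard works with and an entity-based rollup over them. The field names and numbers are invented for illustration; the real schema lives in MongoDB Atlas and is free to evolve:

```javascript
'use strict';

// Invented sample of sales records as they might land in MongoDB Atlas.
var sales = [
  { hotel: 'hotel-berlin', country: 'DE', amount: 420, date: '2018-07-02' },
  { hotel: 'hotel-berlin', country: 'DE', amount: 180, date: '2018-07-03' },
  { hotel: 'hotel-lahore', country: 'PK', amount: 300, date: '2018-07-02' }
];

// Entity-based breakdown: total sales per hotel, the kind of number the
// dashboard reports at a glance across the whole portfolio.
function totalsByHotel( records ) {
  return records.reduce(function( totals, record ) {
    totals[ record.hotel ] = ( totals[ record.hotel ] || 0 ) + record.amount;
    return totals;
  }, {});
}

console.log( totalsByHotel( sales ) );
// → { 'hotel-berlin': 600, 'hotel-lahore': 300 }
```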
⚡ Serverless Automation
Most of the modules are built upon serverless architecture — which in this case provides huge benefits. All the automation is always running, and the cost is pay-as-you-go, i.e., you pay for what you use.
An initial rough estimate puts this solution 34% more economical than having a server VM running all the time. We are using Azure Functions for this serverless automation.
🔥 IoT (Internet of Things) Hub
There are about ~200 IT managers working for my client who have IoT enabled devices that feed data into several online projects. This custom dashboard also includes that data for making better decisions and connecting the whole registration, management, maintenance team’s hub into a single place.
As you might have already guessed, this project makes use of IoT Hub from Microsoft Azure to connect, monitor, and manage all of the IoT assets.
🤖 Machine Learning and Artificial Intelligence
We’re using a lot of different services from Microsoft Azure for the sake of making this dashboard artificially intelligent by Machine Learning.
There’s a huge dataset that is fed to the ML Studio which later helps us predict different decisions like space management, low registrations trends for IT events, and questions like why and when these things happen.
While the Machine Learning part is beyond the scope of this article, I still plan to touch base with some of the awesome Artificial Intelligence I’ve been able to cook in via Azure’s Cognitive Services.
🕰 Live & Real-time
One of the most important aspects of this custom dashboard is that it’s live and real-time. Which means I need a managed database that can cope with this amount of data and still stay highly available.
But at the same time, it’s for the management purposes and doesn’t need to have any impact on the WordPress sites. That is a crucial system design decision for this dashboard.
By that what I mean is we can do all sorts of experiments with this custom dashboard but it shouldn’t have any impact on the database/servers which are running the multi-site WordPress instance.
MongoDB & MongoDB Atlas
For this custom WordPress dashboard, I am using MongoDB Atlas as a DBaaS (Database as a Service). And I couldn’t be happier. When I first shared that I’d be using MongoDB, many developers had concerns.
Most of the questions asked why I'd add another layer of complexity by adding yet another database to the mix. Why not use the WordPress database as it is? To answer these questions and more, I have prepared a list of reasons why I am using MongoDB Atlas.
♨ Dislike for RDBMS
I personally dislike relational databases. Most of the time, for me they get in the way of building applications. I have to completely get out of the app I am building, think about my database in the future and design a good schema which always ends up a bad exercise for my dev workflow. It’s counter-intuitive at best — at least for me, it is.
💸 HDD Is Cheap — CPU/RAM Is Not
Old databases were mostly designed to save disk space, among other things. This led to a plethora of problems like normalization, indexing, and made sharding, auto-scaling, and replication harder.
Nowadays, disk space is dirt cheap. On the other hand, CPU/RAM is not, and your sysadmin costs can skyrocket very quickly if you end up with a bad choice here.
Say you wanted to create a custom dashboard, but your system design architect's choices cost you two sysadmins just to keep the system running. Similarly, my client wanted a managed solution without having to hire a team of IT/DevOps folks — at least for an experimental custom dashboard.
🍀 MongoDB’s Pros
Schema-less. Flexible schema for the win. I don’t have to change anything, my regular app development workflow, creating a Node.js-based app that I am manipulating with JSON type data, I can just feed that into MongoDB and it just works.
Workflow-consistency. Creates documents the way my custom dashboard is represented. Sales, Videos, Talks, Comments, Reviews, Registrations, etc. all of that have similar data representation on the frontend and the backend — and even in the database. I manage 3rd party data via middleware. This consistency translates to clean code.
Ease of scale-out. It scales reads by using replica sets. Scales writes by using sharding (auto-balancing). Just fire up another machine and away you go. Most importantly, instead of vertical scaling via RDBMS, MongoDB lets you scale horizontally with different levels of consistency. That’s a big plus. ➕
Cost. Depends on which RDBMS of course, but MongoDB is free and can run on Linux, ideal for running on cheaper commodity kits.
🍃 Why MongoDB Atlas?
Well, now that I know MongoDB is the right database choice, there are so many different options to host your database. I can self-host on my Linux machine via DigitalOcean, use a cloud provider like AWS/Azure, or choose a DBaaS service specific to MongoDB.
But I want a fast, secure, and managed MongoDB solution that I can easily scale with the growth of the number of modules we attach in this custom WordPress dashboard. That’s MongoDB Atlas.
MongoDB Atlas is a cloud-hosted MongoDB service engineered and run by the same team that builds the database. And guess what, I trust that they follow the best operational practices since they are the ones who’re building MongoDB in the first place.
I want this custom dashboard to be self-managed, serverless, and using MongoDB Atlas saves me from worrying about software patching, backups, and reliable configuration setup for new DB updates. Again a big plus. ➕
Also, the fact that MongoDB Atlas is supported cross-platform as well as cross-region and across different cloud providers makes it a much better choice. Each cluster is deployed as a replica set (three data-bearing nodes), ready to scale.
🔋 MongoDB Compass
Now that we are going to work with MongoDB, it’d be great to have a tool through which we can explore our database, view the changes, debug and so on. For this purpose, MongoDB again takes the lead with a product called MongoDB Compass. Take a look.
I suggest that you go ahead and download MongoDB Compass. It’s literally the best tool to visualize your MongoDB database. Here’s a set of features:
Visualize and explore: Take a look at your database, find out how things are looking, and even visualize stuff like maps/coordinates.
Insert, modify, and delete: You can also perform CRUD operations for your DB right from MongoDB compass. Makes testing easier.
Debug and optimize: Finally, analyze your data, debug it and even find out about performance issues right inside a great GUI for your database. This tool is a must-have if you work with MongoDB.
Extensible: And the best part is you can build your own plugins to extend MongoDB Compass. Here’s the documentation on building your own Compass plugins.
Enterprise Flavor: MongoDB Compass comes in a few flavors: Community (Free), and Enterprise (Licensed) — the Enterprise version is the one that lets you visualize DB schema.
✅ Getting Started with MongoDB Atlas
Let’s get started and build a simple module that's part of the custom WordPress dashboard I am building. For this module, we are collecting all the sales related data. For that, we need a MongoDB instance, and of course we’re using MongoDB Atlas here.
Step #1: Go to MongoDB Atlas →
Go to the MongoDB Atlas site and register a completely free MongoDB instance hosted on AWS, with shared RAM and 512 MB storage. Click the Get Started Free button.
Step #2: Sign up at MongoDB Atlas →
Now go ahead and sign up with your email ID and fill up the details. It’s amazing that you can sign up and use a free MongoDB Atlas hosted DB instance, and they don’t even require you to add a credit card for that.
Step #3: Create the Cluster
Now you’ll be redirected to a page with a bunch of information about the new MongoDB Cluster you’re about to create. I suggest that you review this information, and move ahead by clicking the Create Cluster button at the bottom just like in the screenshot below.
Step #4: Create DB Username & Password
It’ll take a minute and your DB will be created. Once that happens, head over to the Security > MongoDB Users and click on the + ADD NEW USER button on the right, to create a new user for your database. Let’s keep all the other settings set to default for the sake of this intro-article.
I’m setting the user/pass as usermongo but you know better.
Step #5: Add IP to Whitelist for Access
To be able to access your MongoDB Atlas database, you need to set up the IP whitelist with the IP of the server where your app is hosted. Authentication is beyond what I am discussing here, so for the purpose of this demo let's just allow everyone (obviously a bad practice in production).
So, again, head over to the Security > IP Whitelist and click on the + ADD IP ADDRESS button on the right, and finally ALLOW ACCESS FROM ANYWHERE button to allow the anonymous access.
Step #6: Connect via MongoDB Compass
Now that our DB’s IP access and a user has been created, we can pick up the connection string and use it to connect to our database with our MongoDB Compass application.
Go to Connect then choose Connect with MongoDB Compass and download Compass if you haven’t. Copy the URI Connection String. Finally, open Compass and it should be able to detect the connection string in your clipboard, allow it to connect to your database.
And you are set to visualize your database, analyze its performance, and even run complete CRUD operations. Awesome! 💯
Now that we have created a MongoDB Atlas and connected it with MongoDB Compass, we can move forward and start building our Node.js application.
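One practical note before the app code: the Node.js side reaches the cluster through the connection string from the previous step. Below is a small sketch of assembling an Atlas-style mongodb+srv URI. The cluster host is a placeholder (you copy the real one from the Connect dialog), and the driver call is commented out because it needs the mongodb npm package:

```javascript
'use strict';

// Build an Atlas-style connection string. Credentials should come from
// environment variables in production, never from source code.
function atlasUri( user, password, host, db ) {
  return 'mongodb+srv://' + encodeURIComponent( user ) + ':' +
    encodeURIComponent( password ) + '@' + host + '/' + db +
    '?retryWrites=true';
}

var uri = atlasUri(
  process.env.ATLAS_USER || 'usermongo',
  process.env.ATLAS_PASS || 'usermongo',
  'cluster0.example.mongodb.net', // placeholder -- copy yours from Atlas
  'dashboard'
);

console.log( uri );

// With the `mongodb` driver installed, connecting looks like this:
//
// var MongoClient = require( 'mongodb' ).MongoClient;
// MongoClient.connect( uri, function( err, client ) {
//   if ( err ) { throw err; }
//   client.db( 'dashboard' ).collection( 'sales' )
//     .insertOne( { hotel: 'hotel-42', amount: 1200 }, function( err2 ) {
//       if ( err2 ) { throw err2; }
//       client.close();
//     });
// });
```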
WordPress REST API — FTW!
This WordPress based Node.js custom dashboard interacts with the WordPress instance via the WordPress REST API. Since this is a Node.js app, I am using an awesome library called wpapi written by K Adam White. He has also built a demo Express based WordPress app. That’s what I got inspired by while building this custom dashboard, so you’ll see a lot of it here.
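Under the hood, wpapi is just building URLs against the standard WP REST API routes and fetching them. Purely to show what is happening, here is a hand-rolled version of that URL building (the endpoint is a placeholder site):

```javascript
'use strict';

// Build the URL wpapi would hit for a posts query. /wp/v2/posts is the
// standard WP REST API route for posts.
function postsUrl( endpoint, params ) {
  var query = Object.keys( params || {} ).map(function( key ) {
    return encodeURIComponent( key ) + '=' + encodeURIComponent( params[ key ] );
  }).join( '&' );
  return endpoint.replace( /\/$/, '' ) + '/wp/v2/posts' + ( query ? '?' + query : '' );
}

// Second page of posts, ten per page -- roughly equivalent to
// wp.posts().perPage( 10 ).page( 2 ) with wpapi.
console.log( postsUrl( 'https://example.com/wp-json', { per_page: 10, page: 2 } ) );
// → https://example.com/wp-json/wp/v2/posts?per_page=10&page=2
```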
🚀 WordPress Custom Router Based on Express
The router is set up with Express. Here’s a basic error handler and router template for using WordPress with express.
by Ahmad Awais via SitePoint https://ift.tt/2u9J7bb
mbaljeetsingh · 6 years
Text
A Custom WordPress Dashboard with MongoDB Atlas, Microsoft Azure, & Serverless Functions!
TL;DR I’m building a custom WordPress dashboard for an enterprise client which is powered by React.js on top of Node.js with MongoDB Atlas as the database.
This dashboard uses several Microsoft Azure services, e.g., Cognitive Services, Azure App Services, and especially serverless ⚡ Azure Functions. In this post, you’ll learn how to build a small module from it and the reason behind my choice of stack, apps, and products.
🚀 One of my enterprise clients who owns a huge networking and media company has a large-scale WordPress site set up. He recently consulted me about the possibility of building a custom WordPress dashboard (based on the WordPress REST API) — to help him make intelligent business decisions via Machine Learning and Artificial Intelligence.
🤔 With JavaScript eating up the world and WordPress adapting to the move by creating the Gutenberg project — I thought of an architecture/stack where WordPress would be our content layer, a familiar battle-tested environment that does its job well with a custom dashboard that’s built with JavaScript.
😲 When you’re tasked to build a modern JavaScript application, you find yourself in a mix of different frameworks, tools, and dev-workflows. The JavaScript eco-system has grown a lot over the last couple of years. We have many many good options available today.
🎟 So, after researching my options for a bit — I opted to use React.js on top of Node.js to start building the custom WordPress dashboard. While the project is in its ideation phase at the moment, I think it’s important that I share some of our goals here to define context behind my choice of the stack.
Custom WordPress Dashboard Goals
Imagine you own a large networking company where over 500 hotels (in three different countries) use your services to power their conference halls, IT meetings, and online property management like the sites and blogs. That’s what my client does.
Most of this is powered by a huge multi-site WordPress instance that manages everything for the hotels, websites, online booking, registrations, events, tickets, reviews, and comments. There’re also other systems running different software which are able to produce content via REST API.
We’ve set out to create a custom WordPress dashboard with many goals in mind but I’m listing a few of them which are related to this particular article. Take a look at what I have built so far, it’s all based on serverless Azure functions — which are pretty awesome.
👀 High-level Data Reporting
The custom dashboard will report all the high-level data, e.g. things like live sales happening throughout my client’s portfolio (500+ hotels), entity/time based and date based breakdowns.
And that how each of his franchise performing on a daily, weekly, monthly basis. All of this data is being fed to MongoDB Atlas. More on that later.
⚡Serverless Automation
Most of the modules are to built upon serverless architecture — which in this case provides huge benefits. All the automation is always running and the cost is paid as you go i.e. pay for what you use.
An initial rough estimate puts this solution 34% more economical than having a server VM running all the time. We are using Azure Functions for this serverless automation.
🔥 IoT (Internet of Things) Hub
There are about ~200 IT managers working for my client who have IoT enabled devices that feed data into several online projects. This custom dashboard also includes that data for making better decisions and connecting the whole registration, management, maintenance team’s hub into a single place.
As you might have already guessed, this project makes use of IoT Hub from Microsoft Azure to connect, monitor, and manage all of the IoT assets.
🤖 Machine Learning and Artificial Intelligence
We’re using a lot of different services from Microsoft Azure for the sake of making this dashboard artificially intelligent by Machine Learning.
There’s a huge dataset that is fed to the ML Studio which later helps us predict different decisions like space management, low registrations trends for IT events, and questions like why and when these things happen.
While the Machine learning part is beyond the scope of this article, I still plan to touch the base with some of the awesome Artificial intelligence I’ve been able to cook in via Azure’s Cognitive Services.
🕰 Live & Real Time
One of the most important aspects of this custom dashboard is that it’s live and real time. Which means, I need a managed database that can cope with this amount of data and still stay highly available.
But at the same time, it’s for the management purposes and doesn’t need to have any impact on the WordPress sites. That is a crucial system design decision for this dashboard.
By that what I mean is we can do all sorts of experiments with this custom dashboard but it shouldn’t have any impact on the database/servers which are running the multi-site WordPress instance.
MongoDB & MongoDB Atlas
For this custom WordPress dashboard, I am using MongoDB Atlas as a DBaaS (Database as a Service). And I couldn’t be happier. When I first shared that I’ll be using MongoDB, many developers had concerns. Most of the questions were about why add another layer of complexity by adding yet another database to the mix. Why not use the WordPress database as it is. To answer these questions and more I have prepared a list of reasons to why I am using MongoDB Atlas.
♨ Dislike for RDBMS
I personally dislike relational databases. Most of the time, they get in the way of building applications. I have to step completely out of the app I am building, think ahead about my future database, and design a good schema, which always ends up being a bad exercise for my dev workflow. It's counter-intuitive at best; at least for me, it is.
💸 HDD Is Cheap | CPU/RAM Is Not
Old databases were mostly designed to save disk space, among other things. This led to a plethora of problems like heavy normalization and indexing, and it made sharding, auto-scaling, and replication harder.
Nowadays, disk space is dirt-cheap. On the other hand, CPU/RAM is not, and your sys-admin costs can skyrocket very quickly if you end up with a bad choice here.
Say you wanted to create a custom dashboard, but your system-design architect's choices cost you two sys-admins. Similarly, my client wanted a managed solution without having to hire a team of IT/DevOps folks, at least for an experimental custom dashboard.
🍀 MongoDB’s Pro
Schema-less. Flexible schema for the win. I don't have to change anything in my regular app-dev workflow: I'm creating a Node.js-based app where I manipulate JSON-type data, and I can just feed that into MongoDB and it just works.
Workflow consistency. It stores documents the way my custom dashboard represents them. Sales, videos, talks, comments, reviews, registrations, etc. all have a similar data representation on the frontend and the backend, and even in the database. I manage 3rd-party data via middleware. This consistency translates to clean code.
Ease of scale-out. It scales reads by using replica sets and scales writes by using sharding (auto-balancing). Just fire up another machine and away you go. Most importantly, instead of vertical scaling via an RDBMS, MongoDB lets you scale horizontally with different levels of consistency. That's a big plus. ➕
Cost. That depends on which RDBMS you compare against, of course, but MongoDB is free and can run on Linux, ideal for running on cheaper commodity kit.
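For instance, a single sale can live as one self-contained document whose shape mirrors what the frontend renders. The values below are made up for illustration, but the field names match the sales data this dashboard stores later in the article:

```json
{
  "sale_gross": 9.99,
  "earnings": 7.49,
  "currency": "USD",
  "memberSince": "2018-01-15T00:00:00.000Z",
  "customerEmail": "buyer@example.com",
  "event_time": "2018-03-27T10:30:00.000Z"
}
```

No JOINs and no foreign keys: the document itself is the unit of storage.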
🍃 Why MongoDB Atlas?
Well, now that I know MongoDB is the right database choice, there are so many different options to host your database. I can self-host on my Linux machine via DigitalOcean, use a cloud provider like AWS/Azure, or choose a DBaaS service specific to MongoDB.
But I want a fast, secure, and managed MongoDB solution that I can easily scale with the growth of the number of modules we attach in this custom WordPress dashboard. That’s MongoDB Atlas.
MongoDB Atlas is a cloud-hosted MongoDB service engineered and run by the same team that builds the database. And guess what, I trust that they follow the best operational practices since they are the ones who’re building MongoDB in the first place.
I want this custom dashboard to be self-managed, serverless, and using MongoDB Atlas saves me from worrying about software patching, backups, and reliable configuration setup for new DB updates. Again a big plus. ➕
Also, the fact that MongoDB Atlas is supported cross-platform, cross-region, and across different cloud providers makes it a much better choice. Each cluster comes with a three-node replica set out of the box, ready to scale.
🔋 MongoDB Compass
Now that we are going to work with MongoDB, it’d be great to have a tool through which we can explore our database, view the changes, debug and what not. For this purpose, MongoDB again takes the lead with a product called MongoDB Compass. Take a look.
I suggest that you go ahead and download MongoDB Compass. It’s literally the best tool to visualize your MongoDB database. Here’s a set of features:
Visualize and explore: Take a look at your database, find out how things are looking, and even visualize stuff like maps/coordinates.
Insert, modify, and delete: You can also perform CRUD operations for your DB right from MongoDB compass. Makes testing easier.
Debug and optimize: Finally, analyze your data, debug it and even find out about performance issues right inside a great GUI for your database. This tool is a must-have if you work with MongoDB.
Extensible: And the best part is you can build your own plugins to extend MongoDB Compass. Here’s the documentation on building your own Compass plugins.
Enterprise Flavor: MongoDB Compass comes in a few flavors: Community (Free), and Enterprise (Licensed) — the Enterprise version is the one that lets you visualize DB schema.
✅ Getting Started with MongoDB Atlas
Let's get started and build a simple module that's part of the custom WordPress dashboard I am building. For this module, we are collecting all the sales-related data. For that, we need a MongoDB instance, and we're of course using MongoDB Atlas here.
Step #1: Go to MongoDB Atlas →
Go to the MongoDB Atlas site and register a completely free MongoDB instance hosted on AWS, with shared RAM and 512 MB of storage. Click the Get Started Free button.
Step #2: Sign up at MongoDB Atlas →
Now go ahead and sign up with your email ID and fill in the details. It's amazing that you can sign up for and use a free MongoDB Atlas-hosted DB instance, and they don't even require you to add a credit card for that.
Step #3: Create the Cluster
Now you’ll be redirected to a page with a bunch of information about the new MongoDB Cluster you’re about to create. I suggest that you review this information, and move ahead by clicking the Create Cluster button at the bottom just like in the screenshot below.
Step #4: Create DB User/Pass
It'll take a minute, and your DB will be created. Once that happens, head over to Security > MongoDB Users and click the + ADD NEW USER button on the right to create a new user for your database. Let's keep all the other settings set to default for the sake of this intro article.
I’m setting the user/pass as usermongo but you know better.
Step #5: Add IP to Whitelist for Access
To be able to access your MongoDB Atlas database, you need to set up the IP whitelist with the IP of the server where your app is hosted. Authentication is beyond what I am discussing here, so for the purpose of this demo let's just allow everyone (which is actually a bad practice in production).
So, again, head over to Security > IP Whitelist, click the + ADD IP ADDRESS button on the right, and finally the ALLOW ACCESS FROM ANYWHERE button to allow anonymous access.
Step #6: Connect via MongoDB Compass
Now that our DB's IP access is set up and a user has been created, we can pick up the connection string and use it to connect to our database with the MongoDB Compass application.
Go to Connect, then choose Connect with MongoDB Compass and download Compass if you haven't by now. Copy the URI connection string. Finally, open Compass; it should detect the connection string in your clipboard, so allow it to connect to your database.
And you are set to visualize your database, analyze its performance, and even run complete CRUD operations. Awesome! 💯
Now that we have created a MongoDB Atlas cluster and connected it with MongoDB Compass, we can move forward and start building our Node.js application.
WordPress REST API — FTW!
This WordPress-based Node.js custom dashboard interacts with the WordPress instance via the WordPress REST API. Since this is a Node.js app, I am using an awesome library called wpapi, written by K Adam White. He has also built a demo Express-based WordPress app; that's what inspired me while building this custom dashboard, so you'll see a lot of it here.
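Under the hood, wpapi simply builds and fetches WP REST API URLs for you. This is not the library's internal code, just a rough sketch of the URL shape it saves you from assembling by hand (the site URL is a placeholder):

```javascript
// Build a WP REST API URL like the ones wpapi generates, e.g.
// https://example.com/wp-json/wp/v2/posts?per_page=5
function wpRestUrl(site, resource, params = {}) {
  const query = new URLSearchParams(params).toString();
  return `${site}/wp-json/wp/v2/${resource}` + (query ? `?${query}` : '');
}

// With wpapi you'd write wp.posts().perPage(5) instead of:
console.log(wpRestUrl('https://example.com', 'posts', { per_page: 5 }));
// → https://example.com/wp-json/wp/v2/posts?per_page=5
```

The library layers request chaining, pagination, and authentication on top of exactly this kind of URL building.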
🚀 WordPress Custom Router Based on express
The router is set up with express. Here’s a basic error handler and router template for using WordPress with express.
```javascript
'use strict';

var express = require('express');
var router = express.Router();
var siteInfoMiddleware = require('../middleware/site-info');

// Set global site info on all routes
router.use(siteInfoMiddleware);

// Public Routes
// =============
router.get('/', require('./index'));
router.get('/page/:page', require('./index'));
router.get('/:slug', require('./single'));
router.use('/tags/:tag', require('./tag'));
router.use('/categories/:category', require('./category'));

// Catch 404 and forward to error handler.
router.use(function (req, res, next) {
  var err = new Error('Not Found');
  err.status = 404;
  next(err);
});

// Error Handling
// ==============

// Development error handler will print stacktrace.
function developmentErrorRoute(err, req, res, next) {
  res.status(err.status || 500);
  res.render('error', {
    message: err.message,
    error: err
  });
}

// Production error handler. No stacktraces leaked to user.
function friendlyErrorRoute(err, req, res, next) {
  res.status(err.status || 500);
  res.render('error', {
    message: err.message,
    error: {}
  });
}

// Configure error-handling behavior. (Routers don't expose app
// settings like app.get('env'), so check NODE_ENV directly.)
if (process.env.NODE_ENV === 'development') {
  router.use(developmentErrorRoute);
} else {
  router.use(friendlyErrorRoute);
}

module.exports = router;
```
🎚 Basic express Based Implementation
I am not hosting this entire thing on WordPress, but the initial plan was to do just that. If you want to go that route, you'd want to build the index by querying all the info using the RSVP.hash utility for convenience and parallelism. Here's what you would do.
```javascript
'use strict';

var wp = require( '../services/wp' );
var contentService = require( '../services/content-service' );
var pageNumbers = require( '../services/page-numbers' );
var pageTitle = require( '../services/page-title' );
var RSVP = require( 'rsvp' );

function getHomepage( req, res, next ) {
  var pages = pageNumbers( req.params.page );

  RSVP.hash({
    archiveBase: '',
    pages: pages,
    title: pageTitle(),
    // Primary page content
    posts: wp.posts().page( pages.current ),
    sidebar: contentService.getSidebarContent()
  }).then(function( context ) {
    if ( req.params.page && ! context.posts.length ) {
      // Invalid pagination: 404
      return next();
    }
    res.render( 'index', context );
  }).catch( next );
}

module.exports = getHomepage;
```
🦏 Authentication Cooked In
For this setup, you'll also need to authenticate your Node.js app by giving it authentication data, which, along with wpapi, can be processed like this. Beware: this is not always a best practice if you don't use the correct permissions and environment-variable settings.
```javascript
var WP = require( 'wordpress-rest-api' );
var _ = require( 'lodash' );

var config = _.pick( require( './config' ).wordpress, [
  // Whitelist valid config keys
  'username',
  'password',
  'endpoint'
]);

var wp = new WP( config );

module.exports = wp;
```
🦁 Site Content Accumulation
And finally, you can consume all the content by creating a content service which handles recursively fetching:
All the pages of a paged collection.
Your WordPress site’s info.
An alphabetized list of categories.
A specific category (specified by slug) from the content cache.
An alphabetized list of tags.
A specific tag (specified by slug) from the content cache
Other content required to have some feature parity with WP.
The code for this looks somewhat like this.
```javascript
'use strict';

var wp = require( './wp' );
var cache = require( './content-cache' );
var _ = require( 'lodash' );
var RSVP = require( 'rsvp' );

/**
 * Recursively fetch all pages of a paged collection
 *
 * @param {Promise} request A promise to a WP API request's response
 * @returns {Array} A promise to an array of all matching records
 */
function all( request ) {
  return request.then(function( response ) {
    if ( ! response._paging || ! response._paging.next ) {
      return response;
    }
    // Request the next page and return both responses as one collection
    return RSVP.all([
      response,
      all( response._paging.next )
    ]).then(function( responses ) {
      return _.flatten( responses );
    });
  });
}

function siteInfo( prop ) {
  var siteInfoPromise = cache.get( 'site-info' );
  if ( ! siteInfoPromise ) {
    // Instantiate, request and cache the promise
    siteInfoPromise = wp.root( '/' ).then(function( info ) {
      return info;
    });
    cache.set( 'site-info', siteInfoPromise );
  }
  // Return the requested property
  return siteInfoPromise.then(function( info ) {
    return prop ? info[ prop ] : info;
  });
}

/**
 * Get an alphabetized list of categories
 *
 * All archive routes display a sorted list of categories in their sidebar.
 * We generate that list here to ensure the sorting logic isn't duplicated
 * across routes.
 *
 * @method sortedCategories
 * @return {Array} An array of category objects
 */
function sortedCategories() {
  return all( wp.categories() ).then(function( categories ) {
    return _.chain( categories )
      .sortBy( 'slug' )
      .value();
  });
}

function sortedCategoriesCached() {
  var categoriesPromise = cache.get( 'sorted-categories' );
  if ( ! categoriesPromise ) {
    categoriesPromise = sortedCategories();
    cache.set( 'sorted-categories', categoriesPromise );
  }
  return categoriesPromise;
}

/**
 * Get a specific category (specified by slug) from the content cache
 *
 * The WP API doesn't currently support filtering taxonomy term collections,
 * so we have to request all categories and filter them down if we want to
 * get an individual term.
 *
 * To make this request more efficient, it uses sortedCategoriesCached.
 *
 * @method categoryCached
 * @param {String} slug The slug of a category
 * @return {Promise} A promise to the category with the provided slug
 */
function categoryCached( slug ) {
  return sortedCategoriesCached().then(function( categories ) {
    return _.findWhere( categories, { slug: slug });
  });
}

/**
 * Get a specific tag (specified by slug) from the content cache
 *
 * The WP API doesn't currently support filtering taxonomy term collections,
 * so we have to request all tags and filter them down if we want to get an
 * individual term.
 *
 * To make this request more efficient, it uses the cached sortedTags promise.
 *
 * @method tagCached
 * @param {String} slug The slug of a tag
 * @return {Promise} A promise to the tag with the provided slug
 */
function tagCached( slug ) {
  return sortedTagsCached().then(function( tags ) {
    return _.findWhere( tags, { slug: slug });
  });
}

/**
 * Get an alphabetized list of tags
 *
 * @method sortedTags
 * @return {Array} An array of tag objects
 */
function sortedTags() {
  return all( wp.tags() ).then(function( tags ) {
    return _.chain( tags )
      .sortBy( 'slug' )
      .value();
  });
}

function sortedTagsCached() {
  var tagsPromise = cache.get( 'sorted-tags' );
  if ( ! tagsPromise ) {
    tagsPromise = sortedTags();
    cache.set( 'sorted-tags', tagsPromise );
  }
  return tagsPromise;
}

function getSidebarContent() {
  return RSVP.hash({
    categories: sortedCategoriesCached(),
    tags: sortedTagsCached()
  });
}

module.exports = {
  // Recursively page through a collection to retrieve all matching items
  all: all,
  // Get (and cache) the top-level information about a site, returning the
  // value corresponding to the provided key
  siteInfo: siteInfo,
  sortedCategories: sortedCategories,
  sortedCategoriesCached: sortedCategoriesCached,
  categoryCached: categoryCached,
  tagCached: tagCached,
  sortedTags: sortedTags,
  sortedTagsCached: sortedTagsCached,
  getSidebarContent: getSidebarContent
};
```
🛠 Custom Routes & Sales Data
Finally, I have cooked in quite a few custom routes from which I can obtain any kind of sales-related data. For the particular architecture I have in place, I'm again using the RSVP.hash utility for convenience and parallelism. It works like a charm.
```javascript
var WPAPI = require( 'wpapi' );
var RSVP = require( 'rsvp' );

// Using the RSVP.hash utility for convenience and parallelism
RSVP.hash({
  categories: wp.categories().slug( 'it-services' ),
  tags1: wp.tags().slug( 'hotel-name' ),
  tags2: wp.tags().slug( 'march-events' )
}).then(function( results ) {
  // Combine & map .slug() results into arrays of IDs by taxonomy
  var tagIDs = results.tags1.concat( results.tags2 )
    .map(function( tag ) {
      return tag.id;
    });
  var categoryIDs = results.categories
    .map(function( cat ) {
      return cat.id;
    });
  return wp.posts()
    .tags( tagIDs )
    .categories( categoryIDs );
}).then(function( posts ) {
  // Posts matching the requested categories and tags:
  console.log( posts );
});
```
Once I have that data, I send it to Paddle.com for processing along with the purchase order request, so that it can be added to our MongoDB instance via serverless ⚡ Azure Functions.
```javascript
// Registering custom routes.
site.itSales = site.registerRoute( 'sales/v1', '/resource/(?P<some_part>\\d+)' );
site.itSales().somePart( 7 ); // => sales/v1/resource/7

// Query Parameters & Filtering Custom Routes.
site.handler = site.registerRoute( 'sales/v1', 'receipts/(?P<id>)', {
  // Listing any of these parameters will assign the built-in
  // chaining method that handles the parameter:
  params: [ 'before', 'after', 'author', 'parent', 'post' ]
});

// Yields from the custom data of buyers.
site.handler().post( 8 ).author( 92 ).before( dateObj )... // Sent to paddle.
```
It might look odd to some, but WordPress allows you to set up custom post types and custom taxonomies, which is what I'm using here. The above code, however, is not the exact implementation, but a similar approach to what I have used via categories and tags.
This data gets sent to Paddle, and it's heavily cached so that our WordPress instances don't get any sort of load while we experiment with the custom dashboard. I've also cooked in a small data-refresh module which fetches the data on demand from the WordPress instance of choice.
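The refresh module itself boils down to a small time-based cache. Here is a minimal sketch of that idea; `makeCachedFetcher` and `fetchFromWordPress` are hypothetical names, not the actual module:

```javascript
// Cache WordPress responses in memory and only re-fetch once they are
// older than a TTL, so experiments put no load on the WordPress servers.
// `fetchFromWordPress` is a placeholder for the real wpapi call.
function makeCachedFetcher(fetchFromWordPress, ttlMs) {
  const cache = new Map(); // key → { data, fetchedAt }

  return async function get(key) {
    const hit = cache.get(key);
    if (hit && Date.now() - hit.fetchedAt < ttlMs) {
      return hit.data; // still fresh: serve from cache
    }
    const data = await fetchFromWordPress(key);
    cache.set(key, { data, fetchedAt: Date.now() });
    return data;
  };
}
```

An on-demand refresh is then just a matter of clearing the relevant key before the next call.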
Microsoft Azure & Azure Functions
While building this custom WordPress dashboard, I wanted to make sure that each module of this dashboard lived in the form of a serverless app with multiple serverless functions. This decision was based on keeping the dashboard's cost as economical as possible.
👀 Three Options
There are three major cloud service providers: Microsoft Azure, Google Cloud Platform, and Amazon Web Services. Each of them offers serverless functions, called Azure Functions, GCP Cloud Functions, and AWS Lambda respectively.
📘 Choosing Azure
Azure has one of the biggest cloud infrastructures and global presences: 50 Azure regions, more than any other cloud provider. And after testing all three, I found that Azure Functions had the best response time in the UAE (as my client's business is based out of the UAE).
Also, since we're using Azure ML Studio, AI Cognitive Services, and Virtual Machines to host parts of this project, it made complete sense to use Azure Functions for the serverless architecture.
Getting Started with Azure Functions
Let's get started with Azure Functions. I am going to take you through the process of creating a simple serverless Azure function, which will be triggered via HTTP requests, and inside which we'll process the sales information sent to us from Paddle.com.
⚙ What are we building?!
I am building a serverless Azure function which is based on JavaScript and specifically Node.js code.
This Azure function will get triggered by a simple GET HTTP request from our 3rd party payment solution, i.e., Paddle.com
As soon as there’s a sale on Paddle.com, it will trigger a webhook that contains info related to our sale, quantity, item, earnings, and some member-related data that WordPress sent to Paddle.
Using WordPress REST API, I have added some custom data related to the user who purchased the product, like user’s ID in WordPress DB, which WordPress site had this sale, and such user’s meta info.
When the Azure function receives this GET request, it processes the info, extracts what I need to keep in the MongoDB Atlas cluster, and forms a JavaScript object ready to be saved in the DB.
The Azure function then connects to the MongoDB Atlas instance via an npm package called mongoose. After connecting to the database, I create a DB model/schema, and then this data is saved to the MongoDB Atlas cluster.
After that, the Azure function sits there waiting for the next sale to happen. My client only pays for the execution time and the number of executions of Azure Functions (1 million of which are free every month 😮).
Now, this is only a high-level summary of what's happening. There are a lot of steps that I skipped here, like authentication, which is beyond the scope of this article. You should always set up authentication and verification to keep things civil and avoid any overage.
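Stripped of the database work, the shaping step in the middle is plain JavaScript. A simplified sketch (the p_* names are the query parameters Paddle sends in this setup; the earnings parsing from the full version is omitted here):

```javascript
// Turn incoming webhook query parameters into the object we save to
// MongoDB. Field names match the Sale schema used later in the article.
function shapeSale(query) {
  return {
    sale_gross: query.p_sale_gross || '0',
    currency: query.p_currency || 'USD',
    memberSince: query.memberSince || new Date(),
    customerEmail: query.customerEmail || '',
    event_time: new Date(),
  };
}

shapeSale({ p_sale_gross: '9.99', customerEmail: 'buyer@example.com' });
// → { sale_gross: '9.99', currency: 'USD', customerEmail: 'buyer@example.com', ... }
```

Keeping this step pure makes it easy to test without a database or a live webhook.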
So, let’s go ahead and build this thing.
Step #1: Set up Microsoft Azure & VSCode
I expect you to have an Azure account set up on your end. You'll need to subscribe with a credit card, since we need storage for hosting the Node.js files that will be used with Azure Functions, and you have to pay for storage (you'll probably get a free $200 credit for the first month, and even after that the cost is quite low).
So, go ahead and set up the following:
✅ Setup a Microsoft Azure account with a credit card in billing.
✅ Install Visual Studio Code (Psst. I’m making a course on VSCode).
✅ Install the Azure Functions extension on your VSCode.
💡 To enable local debugging, install the Azure Functions Core Tools.
🗂 Create a new directory and open it up in VSCode.
In case you’re wondering which theme and font I am using, it’s Shades of Purple 💜 — for more info see which software and hardware I use.
Step #2: Create a New Function App Project
Now let's create a new function app project. This is really easy with VSCode. All you have to do is go to the Azure extension explorer in the activity bar. From there, access the FUNCTIONS tab and click on the first Create New Project icon.
This will create a demo project with the basic files required to get started, and it will initialize a Git repo for you. I'll keep up with small GIF-based demos to make things easier for you.
Step #3: Create an HTTP-triggered Azure Function
Now that we have created a function app project, let's create an HTTP-triggered serverless Azure function. For that, go to the Azure extension explorer in the activity bar. From there, access the FUNCTIONS tab and click on the second icon, Create Function.
For the sake of this demo, I am choosing to keep the authentication part simple, so I'm going to select anonymous access. The name of our Azure function is HttpTriggerJS, so you will find a new directory created with that name inside your project. It should contain two files, i.e., function.json and index.js.
⚡ A function is a primary concept in Azure Functions. You write code for a function in a language of your choice and save the code and configuration files in the same folder.
🛠 The configuration is named function.json, which contains JSON configuration data. It defines the function bindings and other configuration settings. The runtime uses this file to determine the events to monitor and how to pass data into and return data from function execution. Read more on this file in the official documentation here.
Following is an example function.json file that gets created.
```json
{
  "disabled": false,
  "bindings": [
    {
      "authLevel": "anonymous",
      "type": "httpTrigger",
      "direction": "in",
      "name": "req"
    },
    {
      "type": "http",
      "direction": "out",
      "name": "res"
    }
  ]
}
```
And then there's an index.js file, which contains basic code that you can use to test your Azure function. It receives a parameter name and prints it back to you, or shows you an error asking for this parameter.
```javascript
module.exports = function (context, req) {
  context.log('JavaScript HTTP trigger function processed a request.');

  if (req.query.name || (req.body && req.body.name)) {
    context.res = {
      // status: 200, /* Defaults to 200 */
      body: "Hello " + (req.query.name || req.body.name)
    };
  } else {
    context.res = {
      status: 400,
      body: "Please pass a name on the query string or in the request body"
    };
  }
  context.done();
};
```
Step #4: Deploy & Test Your Azure Function
Now that we have created an Azure function which can be triggered by a GET HTTP request, let's go ahead and deploy it with VSCode and test it with the Postman API explorer.
To deploy the function, go to the Azure extension explorer in the activity bar. From there, access the FUNCTIONS tab and click on the third icon, Deploy to Function App.
This will ask you a bunch of questions, such as the name of your app; use anything unique. I used demo-wp-mdb-azure. VSCode then uses this to create a resource group that groups together your function-app-related resources: its storage (used to save the files) and the created Azure function. Finally, it responds with a public URL.
I then went ahead to access this URL, and it asked for the name param as per the code. When I sent the name param with the Postman app, it responded with Hello Ahmad Awais. 👍
VSCode also asked me to update the function extension app version to beta, and I chose yes, because that will help me use Node.js v8 for async/await.
Step #5: Create package.json and Install mongoose
Now that our Azure function is up and running. Let’s create a package.json file in the root of our project and install mongoose. We’ll need this to connect and save data to our MongoDB Atlas Cluster.
Mongoose provides a straight-forward, schema-based solution to model your application data. It includes built-in typecasting, validation, query building, business logic hooks and more, out of the box. It’s pretty awesome. 💯
Step #6: Add App Setting for MongoDB Connection
Now we are almost ready to start writing code for our application. But before that, we need a connection string to be able to connect to our MongoDB Atlas cluster (just like we did with MongoDB Compass). This connection string is private, and you shouldn't commit it to the Git repo.
💯 This connection string belongs in the local.settings.json file in the project root. Let's first download the settings, then add a MongodbAtlas setting with our connection string (get this string from the MongoDB Atlas dashboard), and upload the app settings.
To do this, go to the Azure extension explorer in the activity bar. From there, access the FUNCTIONS tab and select your subscription, then your Azure function app, i.e., demo-wp-mdb-azure. Right-click Application Settings and select Download remote settings… to download, and Upload local settings… to upload, the settings after adding the MongodbAtlas connection string.
Step #7: Update Node Version of Azure Function
In the code, I intend to use async/await, which is not available in v6.5.0 of Node.js, the version that ships with the default version 1 of Azure Functions. In step #4, VSCode asked me to update the runtime version of the Azure function to beta, and I did that. This enables support for the latest Node.js versions on Azure Functions.
So let's just update the WEBSITE_NODE_DEFAULT_VERSION app setting in our local settings and upload it to the remote settings.
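After both settings are in place, the local.settings.json ends up looking roughly like this. All values here are placeholders, and the exact Node version depends on what the beta runtime supports at the time:

```json
{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "<storage-account-connection-string>",
    "FUNCTIONS_EXTENSION_VERSION": "beta",
    "WEBSITE_NODE_DEFAULT_VERSION": "8.11.1",
    "MongodbAtlas": "mongodb+srv://usermongo:<password>@<cluster-url>/test"
  }
}
```

Remember that this file stays local; the same keys reach the function via the Upload local settings… step above.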
Step #8: Create MongoDB Model/Schema
Before we save any data to our MongoDB Atlas cluster, let's create a modelSale.js file that will contain the model's schema for what we intend to save in the database. It's an extremely simple schema implementation; I suggest you read up on what you can do here with [mongoose](http://mongoosejs.com/docs/guide.html) and MongoDB.
This file is pretty much self-explanatory.
```javascript
/**
 * Model: Sale
 */
const mongoose = require('mongoose');
mongoose.Promise = global.Promise;

// Sale Schema.
const saleSchema = new mongoose.Schema({
  sale_gross: Number,
  earnings: Number,
  currency: String,
  memberSince: Date,
  customerEmail: String,
  event_time: {
    type: Date,
    default: Date.now
  },
});

// Export the model.
module.exports = mongoose.model('Sale', saleSchema);
```
Step #9: Code the ⚡Azure Function with Node.js
Now let's code our Azure function. All the main code lives inside the index.js file for the purpose of this demo. I'm also going to use the context object as the first parameter; make sure you read about it. Everything else is explained in the code snippet below.
So, this is just a demo code for this article. It does the following:
✅ Gets the data from Paddle.com
⚡ Connects to the MongoDB Atlas via connection string that we added in our Application Settings.
📘 Uses the defined DB schema inside the test database where it creates a sales collection including documents for our sales.
⚙ Validates the data and creates a finalData object that gets saved in the MongoDB Atlas Cluster. Yay!!!
🥅 Finally, responds to the Paddle webhook with 200 status code if all goes well, and does the context.done() dance.
Everything is pretty much explained with inline documentation.
```javascript
/**
 * Azure Function: Sale.
 *
 * Gets data from Paddle.com (which in turn gets data
 * from WordPress) and processes the data, creates a
 * finalData object and saves it in MongoDB Atlas.
 *
 * @param context To pass data between function to / from runtime.
 * @param req HTTP Request sent to the Azure function by Paddle.
 */
module.exports = async function (context, req) {
  // Let's call it log.
  const log = context.log;

  // Log the entire request just for the demo.
  log('[RAN] RequestUri=%s', req.originalUrl);

  /**
   * Azure function Response.
   *
   * Processes the `req` request from Paddle.com
   * and saves the data to MongoDB Atlas while
   * responding the `res` response.
   */

  // Database interaction.
  const mongoose = require('mongoose');
  const DATABASE = process.env.MongodbAtlas;

  // Connect to our Database and handle any bad connections
  mongoose.connect(DATABASE);
  mongoose.Promise = global.Promise; // Tell Mongoose to use ES6 promises
  mongoose.connection.on('error', (err) => {
    context.log(`ERROR→ ${err.message}`);
  });

  // Sale Schema.
  require('./modelSale');
  const Sale = mongoose.model('Sale');

  // Create a Response.
  if (req.query.customFieldName) { // Simple authentication for the purpose of demo.

    // Build the data we need.
    const sale_gross = req.query.p_sale_gross || '0';
    const earnings = JSON.parse(req.query.p_earnings)['16413'] || '0';
    const currency = req.query.p_currency || 'USD';
    const memberSince = req.query.memberSince || new Date();
    const customerEmail = req.query.customerEmail || '';
    const event_time = new Date();

    log('[OUTPUT]—— sale_gross: ' + sale_gross);
    log('[OUTPUT]—— earnings: ' + earnings);
    log('[OUTPUT]—— currency: ' + currency);

    const finalData = {
      sale_gross: sale_gross,
      earnings: earnings,
      currency: currency,
      memberSince: memberSince,
      customerEmail: customerEmail,
      event_time: event_time,
    };

    // Save to db.
    const sale = await (new Sale(finalData)).save();
    log("[OUTPUT]—— SALE SAVED: ", sale);

    // Respond with 200.
    context.res = {
      status: 200,
      body: "Thank You for the payment! " + (req.query.customFieldName || req.body.customFieldName)
    };
  } else {
    context.res = {
      status: 400,
      body: "Please pass a name on the query string or in the request body"
    };
  }

  // Informs the runtime that your code has finished. You must call context.done,
  // or else the runtime never knows that your function is complete, and the
  // execution will time out.
  // @link: https://docs.microsoft.com/en-us/azure/azure-functions/functions-reference-node#contextdone-method
  context.done();
};
```
Step #10: Re-Deploy The Azure Function
Now let’s re-deploy the Azure function. For that, go-to the Azure Extension explorer present in the activity bar. From there access FUNCTIONS tab and click on the third Deploy to Function App icon.
Step #11: Test Azure Function via Paddle’s Webhook
Looks like we're pretty much done. All that's left is to test our Azure function by triggering a dummy webhook via Paddle.com. Let's do that. Also, when things work, let's explore how our data looks in MongoDB Compass.
Whew!!! That was a lot. Glad it worked. 🎉
🤔 So, What Just Happened?!
Prepare yourself for a mouthful. I created a small part of the Sales module in the custom WordPress dashboard app that I am building. I used MongoDB Atlas and Compass, then created a Microsoft ⚡Azure Function via a function app with VSCode, deployed the app with the MongoDB connection string stored as a secret application setting, updated the Node.js version, and triggered the function via a dummy webhook from Paddle.com (like it will be triggered when a sale happens) to send data (from Paddle + WordPress) to our Azure function and, from there, to MongoDB Atlas. And it worked, haha!
Machine Learning & Artificial Intelligence
Machine learning and artificial intelligence are always a mesmerizing topic in the world of software technology but we don’t talk a lot about that in the context of WordPress or in the WP Community.
I set out to change that by adding a few small improvements to a select few WordPress sites for my client, and I have every intention of exploring the same with this custom WordPress dashboard.
I have discussed this topic before and shared what I am working on. Take a look at the small artificial intelligence plugin I am building for WordPress and integrating with different Azure Cognitive Services.
I explained this in a video in another post that you can find here: Building A WordPress Artificial Intelligence Plugin →
I’ve accomplished similar results in this dashboard via the wpapi package. First I upload the image to Cognitive Services, and then, on a confident response, I send it to WordPress to be uploaded via the WordPress REST API, with an image description generated by the Computer Vision AI.
/**
 * Get Image Alt Recognition with Computer Vision
 * using Azure Cognitive Services.
 */
var WPAPI = require('wpapi');
var wp = new WPAPI({
    endpoint: 'http://src.wordpress-develop.dev/wp-json'
});

/**
 * Handle Image Alt Generation.
 */
function processImage() {
    // **********************************************
    // *** Update or verify the following values. ***
    // **********************************************

    // Replace <Subscription Key> with your valid subscription key.
    var subscriptionKey = "<Subscription Key>";

    // You must use the same region in your REST call as you used to get your
    // subscription keys. For example, if you got your subscription keys from
    // westus, replace "westcentralus" in the URI below with "westus".
    //
    // Free trial subscription keys are generated in the westcentralus region.
    // If you use a free trial subscription key, you shouldn't need to change
    // this region.
    var uriBase = "https://westcentralus.api.cognitive.microsoft.com/vision/v2.0/analyze";

    // Request parameters.
    var params = {
        "visualFeatures": "Categories,Description,Color",
        "details": "",
        "language": "en",
    };

    // Display the image.
    var sourceImageUrl = document.getElementById("inputImage").value;
    document.querySelector("#sourceImage").src = sourceImageUrl;

    // Make the REST API call.
    $.ajax({
        url: uriBase + "?" + $.param(params),

        // Request headers.
        beforeSend: function (xhrObj) {
            xhrObj.setRequestHeader("Content-Type", "application/json");
            xhrObj.setRequestHeader("Ocp-Apim-Subscription-Key", subscriptionKey);
        },

        type: "POST",

        // Request body.
        data: '{"url": ' + '"' + sourceImageUrl + '"}',
    })
    .done(function (data) {
        // Show formatted JSON on webpage.
        $("#responseTextArea").val(JSON.stringify(data, null, 2));

        // Extract and display the caption and confidence from the first caption
        // in the description object.
        if (data.description && data.description.captions) {
            var caption = data.description.captions[0];

            if (caption.text && caption.confidence >= 0.5) {
                const imgDescription = caption.text;

                // ⬆️ Upload to WordPress.
                wp.media()
                    // Specify a path to the file you want to upload, or a Buffer.
                    .file(sourceImageUrl)
                    .create({
                        title: imgDescription,
                        alt_text: imgDescription,
                        caption: imgDescription,
                        description: imgDescription
                    })
                    .then(function (response) {
                        // Your media is now uploaded: let's associate it with a post.
                        var newImageId = response.id;
                        return wp.media().id(newImageId).update({
                            post: associatedPostId
                        });
                    })
                    .then(function (response) {
                        console.log('Media ID #' + response.id);
                        console.log('is now associated with Post ID #' + response.post);
                    });
            }
        }
    })
    .fail(function (jqXHR, textStatus, errorThrown) {
        // Display error message.
        var errorString = (errorThrown === "") ? "Error. " : errorThrown + " (" + jqXHR.status + "): ";
        errorString += (jqXHR.responseText === "") ? "" : jQuery.parseJSON(jqXHR.responseText).message;
        alert(errorString);
    });
};
👀 Content Moderation Automation
One of the ideas we have is to put Azure’s AI/ML to use as a content moderation platform, which offers built-in human-in-the-loop + machine learning to help moderate images, text, and videos. It’s a work in progress, but it’s something really interesting that you should definitely take a look at.
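To give you a feel for the human-in-the-loop part, here’s a hedged sketch of a gating helper. The `review` object mimics the general shape of a text-screen moderation response (a `Terms` array of flagged words plus a `Classification` block); treat those field names and the sample data as assumptions for illustration, not a definitive API contract:

```javascript
// Decide whether a piece of content should be routed to a human
// reviewer, based on a (hypothetical) moderation response shape.
function needsHumanReview(review) {
    // Any explicitly flagged terms → always send to a human.
    const hasFlaggedTerms = Array.isArray(review.Terms) && review.Terms.length > 0;

    // The classifier itself can recommend a review.
    const reviewRecommended =
        review.Classification && review.Classification.ReviewRecommended === true;

    return hasFlaggedTerms || reviewRecommended;
}

// Hypothetical sample response for demonstration.
const sample = {
    Terms: [{ Term: 'crap', Index: 10 }],
    Classification: { ReviewRecommended: false },
};

console.log(needsHumanReview(sample)); // true — queue it for the human-in-the-loop
```

In the dashboard, a `true` here would hold the comment or image in a pending queue instead of publishing it straight away.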
🕵 WordPress Grammar (Nazi) Intelligence
Y’all have a habit of typing the same typos over and over again. I do that all the time. The coolest thing ever is when search engines like Bing and Google can spell-check and proofread the search query for you.
What if WordPress had that?! So, I got to work and ended up cooking up the same functionality in the WordPress admin area, for when you type a typo in your post title, or more than one typo, for all I care.
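The core of a feature like this is applying the spell-checker’s suggestions back onto the title. Here’s a minimal sketch, assuming `flaggedTokens` shaped like a spell-check API response with `offset`, `token`, and ranked `suggestions` (the sample data is hypothetical):

```javascript
// Apply spelling suggestions to a string. Each flagged token carries
// its character offset, the misspelled token, and suggestions.
function applyCorrections(text, flaggedTokens) {
    // Apply from the end of the string so earlier offsets stay
    // valid after each replacement changes the string length.
    return [...flaggedTokens]
        .sort((a, b) => b.offset - a.offset)
        .reduce((result, flag) => {
            const best = flag.suggestions && flag.suggestions[0];
            if (!best) return result; // nothing to suggest, leave it as-is
            return (
                result.slice(0, flag.offset) +
                best.suggestion +
                result.slice(flag.offset + flag.token.length)
            );
        }, text);
}

const title = 'My awesme WordPres post';
const flagged = [
    { offset: 3, token: 'awesme', suggestions: [{ suggestion: 'awesome' }] },
    { offset: 10, token: 'WordPres', suggestions: [{ suggestion: 'WordPress' }] },
];

console.log(applyCorrections(title, flagged)); // "My awesome WordPress post"
```

In the admin area, the corrected string becomes the suggested title you see under the post-title field.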
I was so excited that I couldn’t contain myself, so there’s me in the left bottom corner. All happy and surprised! 🙌👐👏👊💪🎶☝😌🎧
It’s Your Turn Now!
I really hope that you enjoyed this potential integration between all these modern JavaScript frameworks, AI/ML products, and serverless functions.
This project is a lot of fun. I think if you give this tech stack a shot, you can have this crazy amount of fun as well. So, I’m leaving it up to you to try MongoDB Atlas in the context of WordPress, and maybe attach all that to a bunch of serverless functions.
It’d mean a lot to me if you share this post on Twitter → Also feel free to say 👋 to me there at @MrAhmadAwais.
Peace! ✌
via Scotch.io https://ift.tt/2lZwsEh