Tuesday, December 26, 2023

Trying to make 2024 the best year

With 2024 around the corner, below are the habits I plan to overcome/develop/cement:

Body, Mind, Soul, Relationship, Growth, Career, Lifestyle

1) Dinner at 6 pm on weekdays. Not eating anything after dinner.

2) Waking up at 7:30 am on weekdays.

3) Pray all 5 daily prayers on time.

4) Avoid Chips/Candies/Carbonated drinks

5) Eat more veggies, nuts

6) No mindless death-scrolling of IG/FB/TikTok Reels

7) Read 5 lines of Quran every day with meaning.

8) Read book - finish 1 book in 2 weeks

9) Do morning stretches

10) Play squash 3 times a week

11) Do 10-15 push-ups and lunges - set of 3

12) Spend 10-15 mins in DuoLingo - Italian

13) Read Globe and Mail 

14) Drink coffee at Home

15) Have more home-cooked meals - cook a meal every weekend

16) Be regular with multivitamin pills

17) Build skills for DE: Databricks, Python, Power BI

18) Get (another) property - investment goals

19) Get (another) remote job - save money

20) Plan a trip to Norway/Denmark IA



Wednesday, January 18, 2023

48 hours...

     Addicted or hooked is an understatement right now. I have taken binge-listening to an insane level. I have been listening to "Kahani Suno 2.0" by Kaifi Khalil for 48 hours and counting. I have been hypnotized, enamored, enthralled, beguiled, spellbound by this song - you get the point!
I have been listening while working, driving, going to sleep, going for walks - even TV time is watching this song.

In short, this song has consumed me. The song is about a lover yearning for his lost love, left heartbroken and in a desolate state. The way Kaifi Khalil sang this song - his haunting vocals and simple yet deep lyrics take it to an immortal level. This level can only be achieved by someone who has gone through the 7 levels of human emotions in love - attraction, amusement, infatuation, adulation, passion, heartache, and sorrow.

This song is all the rage as we speak. It has smashed all the numbers in no time, though a song like this needs no validation in terms of social media metrics. This phenomenon reminds me of how Atif Aslam's 'Aadat' burst onto the music scene - his song had a similar theme and effect: simple and deep lyrics, soulful vocals, and the sigh of a lost love.

I have watched unplugged versions that have different variants of the lyrics, each rendering a new flavor and touching different notes of emotion. Believe it or not - I had tears rolling from my eyes on a few occasions. Not that I am remembering 'someone' from the past; it's just the raw emotion of the song that pulls you into a different zone.

Looking forward to more gems from this young lad from Lyari in Karachi, Pakistan. Lyari is the last place you would expect an artist to emerge from. It is really heartening - a flame has risen from the ashes! If I may, this might be the Renaissance in Pakistani music, taking Pakistani music beyond branded studios (Nescafe Basement, Coke Studio, etc.).



Thursday, November 10, 2022

'Watch' the journey

 Content with my Apple Watch, I had absolutely no inclination, fascination, or interest whatsoever in mechanical wristwatches. It's been a 180-degree change in my attitude. To be in awe of the dials, to comment on watch types, to narrate the history of watch brands, to appreciate the movements, to have my YouTube/Google feed filled with watch videos, to check what watch a person is wearing at first glance, and more importantly to think of a watch as a piece of art that retains (or appreciates in) its value and brand prestige - it's been a fascinating watch journey so far. Loving this 'vice', often substituted for 'mid-life crisis'!

It started at Istanbul airport in October 2021. I was there for my hair transplant - post on that for later. My friend Jaffar, who is a huge watch lover and an Omega fan, took me to the Omega store. That was probably the first time I stepped inside a watch boutique!

I started building my knowledge base with Omega - the history of being worn by astronauts, brand ambassadors like George Clooney - but mostly the designs, models, and brand perception resonated with me. I deliberately didn't invest any time in learning about the obvious choice, Rolex - I wanted to steer clear of the cliched path. Not to mention that during the pandemic it has become out of reach, selling at multiples of MSRP - but yeah, that is also one of the main deterrents lol.

In embarking on this watch journey, I wanted to develop my taste, and the motivation to write this post is to revisit it, add more content, and see how my taste has evolved since.

Seiko Kinetic - SQ50
I received this watch as a wedding gift from my father-in-law in 2007. He had bought it back in 1992 when he was working in Saudi Arabia, with the intention of giving it to his son-in-law, aka me, one day :) This was my first proper, respectable formal watch, which I still love. I still have the original box, warranty card, and manual.

 


Movado Museum
I received this watch as a gift from my wife in 2017, to commemorate our 10th wedding anniversary. It was my first entry-level luxury watch, as you could say. I enjoy wearing this watch.
It is called the Museum after the very first version was selected by New York's Museum of Modern Art for its permanent collection in 1959 - a minimalist design defined by a single dot at 12 o'clock to symbolize the sun at high noon.



Tissot PRX80 Automatic

I bought this watch in Oct 2022 from Mirani Jewelers in Oakville Place Mall for 988.75 CAD. By this point I was well versed in the Swiss brands. Instead of shooting directly for the stars, I got into watches in the 1k price range.

Tissot is a respectable brand with history, and its new PRX80 Automatic is a reincarnated version of its Seastar design. PRX stands for Precision, Robust, and the Roman numeral X (10) - indicating 10 bar of water resistance. It has a power reserve of 80 hours.
I love the blue waffle dial with the integrated bracelet. The textured dial does give an impression of the AP Royal Oak, but that's where the similarity ends.

 


Baltic HMS 002
I was introduced to Baltic by Jaffar, who bought the limited Baltic MR 002. Although I liked the MR 002, the HMS 002 piqued my interest. It is a sector-dial watch with leaf hands. I simply loved the off-white dial and blue leather strap. This is a more casual watch. I got it from the Baltic website and paid 539 CAD along with 72 CAD in import fees. This French watch is more about the design than the movement - the movement is an outsourced Miyota 8315.

The domed hesalite crystal gives it a distorted look without ever compromising the legibility of the time. A 38 mm case, a 60-hour power reserve, and 50 m of water resistance - this watch has it all without breaking the bank. Its high-end competitor would be the Longines Heritage Classic.



Hamilton Khaki Field Mechanical
This is a fun watch, especially when you are having a 'field' day. It is well built, and the manual-wound movement is a pleasure to operate. 38 mm in size with a NATO strap - it looks amazing with the white and green contrast. I bought it in Summer 2023. It has a military heritage - Hamilton was a chosen tool-watch provider for soldiers during the World Wars. Its water resistance is rated at 50 m - not great, but ideal for an outdoor sports watch. Hamilton was founded in Lancaster, Pennsylvania, in 1892; its parent company today is the Swiss conglomerate Swatch Group. It is a reputable brand and fantastic value for money. I wear it every day and love this piece.



Tuesday, March 2, 2021

Inside the mind of a Product Manager

Recently I read some articles and watched some YouTube videos on the roles and responsibilities of a Product Manager. I found it fascinating to understand the mindset of a Product Manager. In layman's terms, he/she is a glorified Business Analyst with a developer's skillset and marketing acumen, operating within the realms of a legal framework to provide the maximum utility of a product - or, to put it at an even more exalted level, the CEO of a product.

A Product Manager is responsible for owning the product, having a vision for that product, and thinking of creative ways to work with developers, the marketing team, and any other peripheral teams (legal, security, design, communication, etc.) to bring that vision to fruition.
Also, I feel that a Product Manager needs to be in touch with ground realities time and again. Working closely with a product can give them tunnel vision, and they can feel very protective of their creation.

As a consultant, I can relate in many ways to how a Product Manager navigates different hoops and obstacles and works for the greater good.

I have worked with many different teams and team dynamics, coupled with different technologies and domains in play. I keep my eyes set on the prize - the delivery of the project - which makes one immune to internal politics or any disruptions. Resolve to get things done, grit to endure hierarchical nomenclature, and the ability to navigate the organizational terrain.

The focus should be on the delivery (launch) as well as sailing (land) of a Project.

Validation is one of the key indicators of whether processes are flowing smoothly, free of bottlenecks, and without data integrity being compromised - having checkpoints in the form of reports to verify the data coming in, the data processed, and the output result. In short, the landing of a space shuttle is equally important, if not more so, than its launch.





Supervised Learning : KNN Algorithm

K-Nearest Neighbors (KNN) Algorithm*

Motivation

The k-nearest neighbor algorithm, commonly known as the KNN algorithm, is a simple yet effective classification and regression supervised machine learning algorithm.

Classification is intuitive to human learning. Since we were kids, our parents made sure we recognized objects, letters, and people correctly. This method is formally known as supervised learning. In terms of notation, supervised learning is a function that maps an input to an output based on example input-output pairs.

As a parent myself, when I tell my then 2-year-old that a round object that bounces is a ball, 'round' and 'bounce' are the features (input) and 'ball' is the label (output).
This entire process can be called model training. Similarly, when the output is a continuous value rather than a category, it is known as a regression problem.

What is the K-Nearest Neighbors (KNN) Algorithm?

The KNN algorithm is a classical machine learning algorithm that focuses on the distance from new unclassified/unlabeled data points to existing classified/labeled data points.

Figure 1: Colors of the three flower categories in our example dataset, red, yellow and green.

We already have labeled data points in our dataset. These data points belong to three categories, represented by red, green, and yellow as shown above.

Along comes a new 'alien' data point, represented by the black cross mark, and we need to determine which of the three categories this new data point belongs to.

First, we choose a value for 'k'. The k-value tells us the number of nearest points to look at from the new unlabelled data point. Suppose k = 5. Next, we calculate the distance from the unlabelled data point to every data point on the graph and select the 5 shortest distances.

Figure 2: Measuring the distance between the categories from our dataset.
Amongst the 5 nearest data points, it is evident that the new data point belongs to category red, as most of its nearest neighbors are from category red.

Figure 3: Seeing how the new data point belongs to the category red from our example dataset.
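The majority-vote procedure described above can be sketched in a few lines of NumPy. This is an illustrative toy example with made-up points and labels, not code tied to the figures:

```python
import numpy as np
from collections import Counter

def knn_classify(X_train, y_train, x_new, k=5):
    """Classify x_new by majority vote among its k nearest labeled points."""
    # Euclidean distance from x_new to every labeled point
    dists = np.linalg.norm(X_train - x_new, axis=1)
    # Indices of the k smallest distances
    nearest = np.argsort(dists)[:k]
    # Majority vote among the neighbors' labels
    return Counter(y_train[nearest].tolist()).most_common(1)[0][0]

# Toy data: three "red" points near the origin, two "green" points far away
X = np.array([[0.0, 0.0], [0.1, 0.2], [5.0, 5.0], [5.2, 4.9], [0.2, 0.1]])
y = np.array(["red", "red", "green", "green", "red"])
print(knn_classify(X, y, np.array([0.1, 0.1]), k=3))  # -> red
```

With k = 3, all three nearest neighbors of the new point are red, so the vote is unanimous.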

Similarly, in a regression problem, the aim is to predict a new data point's value. On an x-y graph, each data point has 2 features: the x-axis represents feature-1, and the y-axis represents feature-2.

We introduce a new data point for which only the feature-1 value is known, and we need to predict the feature-2 value. We start with a k-value of 5 and find the 5 nearest neighboring points to the new data point. The predicted feature-2 value for the new data point is the mean of the feature-2 values of its 5 nearest neighbors.

Figure 4: Our example of introducing a new data point to illustrate the k-nearest neighbor (KNN) algorithm.
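The regression variant - predict by averaging the neighbors' values - can be sketched the same way. This is a hypothetical one-feature example with made-up data:

```python
import numpy as np

def knn_regress(x_train, y_train, x_new, k=5):
    """Predict feature-2 for x_new as the mean of its k nearest neighbors' values."""
    dists = np.abs(x_train - x_new)   # distance on feature-1 (1-D)
    nearest = np.argsort(dists)[:k]   # indices of the k closest training points
    return y_train[nearest].mean()    # average their feature-2 values

# Made-up (feature-1, feature-2) pairs lying roughly on a line
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0])
y = np.array([1.1, 1.9, 3.2, 3.9, 5.1, 6.0, 7.2])
print(knn_regress(x, y, 3.5, k=5))  # mean of the 5 closest y-values, ~3.04
```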

When to use KNN?

1.      When the application requires high accuracy.

2.      When the dataset is small.

3.      When the data is labeled correctly, and the predicted value will be among the given labels.

4.      To solve regression, classification, or search problems.

Pros and Cons of Using KNN

Pros

·        It is straightforward to implement since it requires only 2 parameters: the k-value and a distance function.

·        A good value of k will make the algorithm robust to noise.

·        It learns a nonlinear decision boundary.

·        There are almost no assumptions on the given data.

·        It is a non-parametric approach. No model fitting/training is required.

Cons

·        Inefficient for large datasets, since the distance to every point needs to be calculated.

·        The model is susceptible to outliers.

·        It cannot handle imbalanced data well; this needs to be handled explicitly.

·        If the dataset requires a large k-value, the computational cost of the algorithm increases.

Math Behind KNN

The algorithm calculates the distance from the new data point to each existing data point. There are 2 common methods.

Figure 5: A representation of the Minkowski distance.

Minkowski Distance:

a) The Minkowski distance is a generalized distance function in a normed vector space: D(x, y) = (sum_i |x_i - y_i|^p)^(1/p).

 

b) Special cases:

1.      When p=1 — this is the Manhattan Distance

2.      When p=2 — this is the Euclidean Distance

Euclidean distance

The Euclidean distance between two points in Euclidean space is the length of the line segment between the two points. We use the following formula:

Figure 7: Euclidean distance formula - d(x, y) = sqrt(sum_i (x_i - y_i)^2).

Manhattan distance

The Manhattan distance calculation is similar to that of the Euclidean distance, the difference being that we sum the absolute differences instead of taking the square root of the sum of squared differences.

Fundamentally, the Euclidean distance represents "flying from one point to another," while the Manhattan distance is "traveling from one point to another" in a city, following the pathways or roads.

Figure 8: Manhattan distance formula - d(x, y) = sum_i |x_i - y_i|.

The more common method is the Euclidean distance formula. One reason is that Euclidean distance measures the straight-line distance in any direction, whereas Manhattan distance is constrained to moves along the axes.
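To make the distances above concrete, here is a small sketch computing both as special cases of the Minkowski distance. The two points are arbitrary examples:

```python
import numpy as np

def minkowski(a, b, p):
    """Minkowski distance: (sum_i |a_i - b_i|^p)^(1/p)."""
    return np.sum(np.abs(a - b) ** p) ** (1 / p)

a = np.array([1.0, 2.0])
b = np.array([4.0, 6.0])

print(minkowski(a, b, 1))  # p=1, Manhattan: |1-4| + |2-6| = 7.0
print(minkowski(a, b, 2))  # p=2, Euclidean: sqrt(3^2 + 4^2) = 5.0
```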

How to Choose the Right Value for K?

There is no standard statistical method to compute the optimal k-value. We want a k-value that reduces errors. As we increase k, predictions become more stable due to averaging or majority voting, but beyond a point the error rate rises again as the model underfits. A small k yields low bias and high variance (higher complexity), while a large k yields high bias and low variance. There are a few methods to try: domain knowledge, cross-validation, or the square-root rule.
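As one illustration of the square-root rule mentioned above, a common starting point is k ≈ sqrt(n), nudged to an odd number so that majority votes cannot tie in two-class problems. This helper is a hypothetical sketch, not a standard library function:

```python
import math

def sqrt_k(n_samples):
    """Rule-of-thumb starting k: round(sqrt(n)), made odd to avoid tie votes."""
    k = max(1, round(math.sqrt(n_samples)))
    return k if k % 2 == 1 else k + 1

print(sqrt_k(150))  # sqrt(150) ≈ 12.25 -> 12 -> 13 (made odd)
```

For the 150-sample Iris dataset, this rule suggests starting the search around k = 13; cross-validation can then refine the choice.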

Implementation of KNN in Python

We use the Iris dataset of Iris flowers from three related species: Setosa, Versicolor, and Virginica. The observed features are sepal length, sepal width, petal length, and petal width.

import numpy as np
import pandas as pd
import matplotlib.pyplot as plt
import seaborn as sns
from sklearn.model_selection import train_test_split, cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.datasets import load_iris
from sklearn import metrics

# Load the Iris dataset as a single DataFrame (features plus a 'target' column)
iris = load_iris(as_frame=True).frame

The table below shows the first 5 rows of data in a Pandas DataFrame. The target values are 0.0, 1.0, and 2.0, representing Setosa, Versicolor, and Virginica, respectively.


KNN is sensitive to outliers and imbalanced data. Plotting a count plot for the target variable shows 50 samples of each flower type - thankfully, the data is perfectly balanced.

sns.countplot(x='target', data=iris)



for feature in ['sepal length (cm)', 'sepal width (cm)', 'petal length (cm)', 'petal width (cm)']:
    sns.boxplot(x='target', y=feature, data=iris)
    plt.show()

 


Next, we split the data into training and testing sets to measure how accurate the model is. The model will be trained on the training set - a randomly selected 60% of the original data, with 40% held out for testing. Before splitting, it is essential to separate the features (independent variables) from the target (dependent variable).

X = iris.drop(['target'], axis=1)
y = iris['target']
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.4, random_state=0)

We build the initial model with a k-value of 1, meaning only the single nearest neighbor is considered when classifying a new data point. Internally, the distances from the new data point to all training points are calculated and sorted in ascending order along with their respective classes. Since the k-value is 1, the class (target value) of the first instance in the sorted array determines the new data point's class. As we can see, we obtain a decent accuracy score of 91.7%. However, the optimal k-value still needs to be selected.

knn = KNeighborsClassifier(n_neighbors=1)
knn.fit(X_train, y_train)
print(knn.score(X_test, y_test))
Output: 0.9166666666666666

To find the optimal k-value by cross-validation, we calculate the accuracy for k-values ranging from 1 to 25 and choose the best one.

k_range = list(range(1, 26))
scores = []
for k in k_range:
    knn = KNeighborsClassifier(n_neighbors=k)
    knn.fit(X_train, y_train)
    y_pred = knn.predict(X_test)
    scores.append(metrics.accuracy_score(y_test, y_pred))

# Plot once, after all k-values have been scored
plt.plot(k_range, scores)
plt.xlabel('Value of k')
plt.ylabel('Accuracy Score')
plt.title('Accuracy Scores for different values of k')
plt.show()

 


Finally, we introduce a new unlabeled data point and predict its class - the flower type it belongs to. We build the model with a k-value of 11.

knn = KNeighborsClassifier(n_neighbors=11)
knn.fit(iris.drop(['target'], axis=1), iris['target'])
X_new = np.array([[1, 2.9, 10, 0.2]])
prediction = knn.predict(X_new)
print(prediction)
if prediction[0] == 0.0:
    print('Setosa')
elif prediction[0] == 1.0:
    print('Versicolor')
else:
    print('Virginica')

Output:
[2.]
Virginica

KNN Applications

From forecasting epidemics and the economy to information retrieval, recommender systems, data compression, and healthcare, the k-nearest neighbors (KNN) algorithm has become fundamental to many applications. KNN is primarily used for regression and classification tasks.

Conclusion

KNN is a highly effective and easy-to-implement supervised machine learning algorithm that can be used for classification and regression problems. The model works by calculating the distances from the point being predicted to a selected number, K, of its nearest examples.

For a classification problem, the label becomes the majority vote of the nearest K points. For a regression problem, the label becomes the average of the nearest K points.

 

* = Source Credit : TowardsAI.net


Tuesday, December 15, 2020

Chapli Kabab

Chapli Kabab is my favourite kabab after Bihari Kabab. I had Chapli Kabab recently at my friend Sohail Bhai's place. I was dropping something off at his door and, being his usual hospitable and gracious self, he offered me something he had made. I went after dinner, but it was Sohail Bhai's creation - I couldn't say no to it (and I am a bit of a glutton anyway lol). He shared the recipe with me. This past weekend, when my parents were visiting, I got excited and decided to surprise them by attempting to make Chapli Kabab.



Recipe Process:

- 1/2 kg of minced beef (25% fat)
- Whisk 2 eggs and fry both sides in 2 tbsp of oil
- 3 medium-sized red onions - finely chopped, excess water removed
- 10-12 desi green chillis - finely chopped
- Mix the green chillis and onions into the minced beef (qeema)
- + 1 tbsp of ginger + garlic paste
- + 2 cups of freshly chopped coriander
- + 1.5 tbsp of crushed coriander seeds
- + 1 tbsp red chilli flakes
- + 1 tsp cumin powder
- + 1 tsp garam masala
- + 1.5 tsp crushed ajwain (carom seeds)
- + 1.5 tbsp crushed anardana (dried pomegranate seeds)
- + 0.5 tsp black pepper powder
- Mix everything
- + 1 cup of maize flour - mix it in
- + 1 raw egg - mix it in
- + 2 chopped, de-seeded tomatoes
- Refrigerate for at least 30 mins.

- Heat the oil and fry each side for 7-8 mins.

My parents and family really enjoyed it, and I surprised myself (again). It was a lot of work, but it paid off. Alhamdulillah!






Thursday, October 22, 2020

Data Science : Chapter 2

Wanted the title to be "Data Science: Reloaded" or "Data Science 2.0", but "Chapter 2" gives it a growth-mindset connotation instead of a dramatic (read: cheesy lol) title.

I was introduced to this 6-month Data Science certificate program by my ex-colleague Noman Bhai. It was an instant 'yes', as in the back of my mind I always wanted to reinforce my past learning about Data Science. I completed my Masters in Management Analytics from Queen's University, and it has been 5 years since then.

This program uses Python, which was another driver in plunging into this certification. I am halfway done (almost at the end of Module 3) and must say it has been a great journey. I am learning a lot from the Zoom class sessions and have met some dynamic members of the cohort from different backgrounds. It has given me a lot of exposure to statistical modeling and made me appreciate it even more with the introduction of Python.

The faculty is young and enthusiastic. The passion and desire in explaining Data Science and stats concepts is evident. Most of them are pursuing Masters or PhDs, delivering gems in a professional and respectful manner.

Presentations are complemented with lab sessions to reinforce the concepts. The TA is always accessible for any questions. Each module has sporadic quizzes, and every module has a written (virtual) and practical (take-away) exam - honestly, studying for exams is not my forte (anymore lol).

Nevertheless, I have something to look forward to on weekends. It is painful to get up early on weekends as weekends are the only 'no alarm' sleeps - but not anymore. Hey - I ain't complaining. I did mention I am enjoying them - learning is fun (sometimes) :)

The biggest 'surprise' (as some may call it) was my introduction to Konain Qurban. In hindsight, embarking on this certification seems like an excuse, as destiny wanted me to meet him. Younger than me, but an enormous wealth of experience, information, and wisdom (had to say that part lol). He is a trailblazer - the founder of this institute, the Frontier Institute of Technology. A visionary. If such a title exists, he is clearly the front runner for "30 under 30" in Pakistan!

On a lighter side, he also introduced me and family to the best Hyderabadi restaurant in Canada! Will go again when you are back in December !

Almost forgot, we also must do a Capstone Project. I have been assigned a group - looking forward to this sub-journey.

On a serious note, I should start my assignment for Module 3. One thing to add from the work front - it is so damn busy, as Covid has shamelessly erased the work-home boundaries; I end up working much more and feeling burnt out as well. I can write more but will hold that thought for a later post (plus I need to stop procrastinating and get on with the assignment!)

 

*Edits. Captured my social media 'fame' lol :