Posts

Python Automation for Group by and Line Chart

    def conversion_rate(dataframe, column_names):
        # Total number of converted users
        column_conv = dataframe[dataframe['converted'] == True] \
                          .groupby(column_names)['user_id'].nunique()
        # Total number of users
        column_total = dataframe.groupby(column_names)['user_id'].nunique()
        # Conversion rate
        conversion_rate = column_conv / column_total
        # Fill missing values with 0
        conversion_rate = conversion_rate.fillna(0)
        return conversion_rate
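To see what conversion_rate() returns, here is a minimal sketch on a made-up DataFrame (the user_id, channel, and converted values below are purely illustrative, not from any real dataset):

```python
import pandas as pd

def conversion_rate(dataframe, column_names):
    # Unique converted users per group
    column_conv = dataframe[dataframe['converted'] == True] \
                      .groupby(column_names)['user_id'].nunique()
    # Unique users per group
    column_total = dataframe.groupby(column_names)['user_id'].nunique()
    # Divide, and fill groups with no conversions with 0
    return (column_conv / column_total).fillna(0)

# Hypothetical marketing data: three users per channel
df = pd.DataFrame({
    'user_id':   [1, 2, 3, 4, 5, 6],
    'channel':   ['email', 'email', 'email', 'ads', 'ads', 'ads'],
    'converted': [True, False, False, True, True, False],
})

rate = conversion_rate(df, ['channel'])
print(rate)
```

The result is a Series indexed by channel: 2 of 3 unique ads users converted, 1 of 3 email users.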

A/B test using Python

Calculating KPIs

You're now going to take what you've learned and work through calculating a KPI yourself. Specifically, you'll calculate the average amount paid per purchase within a user's first 28 days, using the purchase_data DataFrame from before. This KPI can provide a sense of how popular different in-app purchase price points are with users in their first month.

    max_purchase_date = current_date - timedelta(days=28)

    # Filter to only include users who registered before our max date
    purchase_data_filt = purchase_data[purchase_data.reg_date < max_purchase_date]

    # Filter to contain only purchases within the first 28 days of registration
    purchase_data_filt = purchase_data_filt[
        purchase_data_filt.date <= purchase_data_filt.reg_date + timedelta(days=28)
    ]
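The filtering above can be checked end to end on a tiny, hand-made purchase log (the uid, dates, and prices below are invented; only the column names mirror the snippet):

```python
from datetime import datetime, timedelta
import pandas as pd

# Hypothetical purchase log; column names mirror the exercise
current_date = datetime(2018, 3, 17)
purchase_data = pd.DataFrame({
    'uid':      [1, 1, 2, 3],
    'reg_date': [datetime(2018, 1, 1), datetime(2018, 1, 1),
                 datetime(2018, 3, 1), datetime(2018, 2, 1)],
    'date':     [datetime(2018, 1, 10), datetime(2018, 2, 20),
                 datetime(2018, 3, 10), datetime(2018, 2, 15)],
    'price':    [499, 299, 99, 199],
})

# Only users whose full first 28 days fall before current_date
max_purchase_date = current_date - timedelta(days=28)
purchase_data_filt = purchase_data[purchase_data.reg_date < max_purchase_date]

# Only purchases made within 28 days of registration
purchase_data_filt = purchase_data_filt[
    purchase_data_filt.date <= purchase_data_filt.reg_date + timedelta(days=28)
]

# Average amount paid per purchase in the first 28 days
avg_purchase = purchase_data_filt.price.mean()
print(avg_purchase)
```

User 2 registered too recently to be included, and user 1's second purchase falls outside the 28-day window, so the mean is taken over the 499 and 199 purchases.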

SQL triggers

    -- Create the trigger
    CREATE TRIGGER TrackRetiredProducts
    ON Products
    AFTER DELETE
    AS
        INSERT INTO RetiredProducts (Product, Measure)
        SELECT Product, Measure
        FROM deleted;

INSTEAD OF trigger

The company doesn't want regular users to add discounts; only the Sales Manager should be able to do that. To prevent such changes, you need to create a new trigger called PreventNewDiscounts. The trigger should be attached to the Discounts table and prevent new rows from being added to the table.

    -- Create a new trigger
    CREATE TRIGGER PreventNewDiscounts
    ON Discounts
    INSTEAD OF INSERT
    AS
        RAISERROR ('You are not allowed to add discounts for existing customers.', 16, 1);

Twitter API

API Authentication

The package tweepy handles all of the Twitter API OAuth authentication details for you; all you need to do is pass it your authentication credentials. In this interactive exercise, we have created some mock authentication credentials (if you wanted to replicate this at home, you would need to create a Twitter App, as Hugo detailed in the video). Your task is to pass these credentials to tweepy's OAuth handler.

Import the package tweepy. Pass the parameters consumer_key and consumer_secret to the function tweepy.OAuthHandler(). Complete the passing of OAuth credentials to the OAuth handler auth by applying the method set_access_token() to it, along with the arguments access_token and access_token_secret.

    import tweepy

    # Store OAuth authentication credentials in relevant variables
    access_token = "1092294848-a..."

Web Scraping using JSON and API

JSON: from the web to Python

Wow, congrats! You've just queried your first API programmatically in Python and printed the text of the response to the shell. However, as you know, your response is actually JSON, so you can do one step better and decode it. You can then print the key-value pairs of the resulting dictionary. That's what you're going to do now!

Pass the variable url to the requests.get() function in order to send the relevant request and catch the response, assigning the resultant response message to the variable r. Apply the json() method to the response object r and store the resulting dictionary in the variable json_data.

    # Import package
    import requests

    # Assign URL to variable: url
    url = 'http://www.omdbapi.com/?apikey=72bc447a&t=social+network'

    # Package the request, send the request and catch the response: r
    r = requests.get(url)

    # Decode the JSON data into a dictionary: json_data
    json_data = r.json()

    # Print each key-value pair in json_data
    for key in json_data.keys():
        print(key + ': ', json_data[key])
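Since the decoding step doesn't require hitting the live OMDb endpoint, here is a stdlib-only sketch of it using a hand-written JSON string (the values are illustrative, merely shaped like an OMDb response):

```python
import json

# A hand-written payload shaped like an OMDb response (values are illustrative)
raw = '{"Title": "The Social Network", "Year": "2010", "imdbRating": "7.8"}'

# requests' r.json() does essentially this: parse the response body into a dict
json_data = json.loads(raw)

# Print the key-value pairs, as in the exercise
for key in json_data.keys():
    print(key + ': ', json_data[key])
```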

Web Scraping using BeautifulSoup

Assign the URL of interest to the variable url. Package the request to the URL, send the request, and catch the response with the single function requests.get(), assigning the response to the variable r. Use the text attribute of the object r to return the HTML of the webpage as a string; store the result in the variable html_doc. Create a BeautifulSoup object soup from the resulting HTML using the function BeautifulSoup(). Use the method prettify() on soup and assign the result to pretty_soup.

    # Import packages
    import requests
    from bs4 import BeautifulSoup

    # Specify url: url
    url = 'https://www.python.org/~guido/'

    # Package the request, send the request and catch the response: r
    r = requests.get(url)

    # Extract the response as html: html_doc
    html_doc = r.text

    # Create a BeautifulSoup object from the HTML: soup
    soup = BeautifulSoup(html_doc)

    # Prettify the BeautifulSoup object: pretty_soup
    pretty_soup = soup.prettify()
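For intuition about what BeautifulSoup is doing under the hood, here is a stdlib-only sketch that pulls the title out of a small HTML string with html.parser.HTMLParser (the HTML below is made up; with bs4 installed, soup.title.get_text() does the same in one line):

```python
from html.parser import HTMLParser

class TitleExtractor(HTMLParser):
    """Collects the text inside the <title> tag."""
    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ''

    def handle_starttag(self, tag, attrs):
        if tag == 'title':
            self.in_title = True

    def handle_endtag(self, tag):
        if tag == 'title':
            self.in_title = False

    def handle_data(self, data):
        if self.in_title:
            self.title += data

# A made-up stand-in for the HTML that r.text would return
html_doc = "<html><head><title>Guido's Home Page</title></head><body><p>Hi</p></body></html>"

parser = TitleExtractor()
parser.feed(html_doc)
print(parser.title)
```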

Web Scraping using urllib.request

Now that you know the basics behind HTTP GET requests, it's time to perform some of your own. In this interactive exercise, you will ping our very own DataCamp servers to perform a GET request to extract information from the first coding exercise of this course, "https://campus.datacamp.com/courses/1606/4135?ex=2".

Import the functions urlopen and Request from the subpackage urllib.request. Package the request to the url "https://campus.datacamp.com/courses/1606/4135?ex=2" using the function Request() and assign it to request. Send the request and catch the response in the variable response with the function urlopen(). Run the rest of the code to see the datatype of response and to close the connection!

    # Import packages
    from urllib.request import urlopen, Request

    # Specify the url
    url = "https://campus.datacamp.com/courses/1606/4135?ex=2"

    # This packages the request: request
    request = Request(url)

    # Send the request and catch the response: response
    response = urlopen(request)

    # Print the datatype of response
    print(type(response))

    # Be polite and close the response!
    response.close()
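A Request object can also be inspected before anything is sent over the wire. The sketch below builds one offline (no network call) and confirms that, with no data payload attached, urllib defaults to the GET method:

```python
from urllib.request import Request

url = "https://campus.datacamp.com/courses/1606/4135?ex=2"

# Package the request without sending it
request = Request(url)

# With data=None (the default), urllib uses the GET method
print(request.get_method())
print(request.full_url)
```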