My only design consideration: should I write one function taking radius and height and returning two values, area and volume, or two functions, one returning the area and one returning the volume?
I chose two functions, for clarity and simplicity ( to my C programmer's mind )
round ( CylinderArea (radius,height), 2 )
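For what it's worth, a minimal Python sketch of the two-function approach, matching the call above. `CylinderArea` is assumed here to mean the total surface area (two ends plus the curved side):

```python
import math

def CylinderArea(radius, height):
    """Total surface area: two end circles plus the curved side."""
    return 2 * math.pi * radius ** 2 + 2 * math.pi * radius * height

def CylinderVolume(radius, height):
    """Volume: end-circle area times height."""
    return math.pi * radius ** 2 * height

print(round(CylinderArea(3, 5), 2))    # 150.8
print(round(CylinderVolume(3, 5), 2))  # 141.37
```

Keeping each function to one job also means a caller who only wants the volume never pays for (or reads about) the area calculation.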
The .ipynb files are Jupyter Notebook files. They are the coursework files seen in the following videos, which you can replicate on your laptop in a browser or by installing JupyterLab on your local machine : HOW?
To try it quickly in your browser,
go to : https://jupyter.org/try
A new page will open : https://hub.mybinder.turing.ac.uk/....
Five crucial elements
1. Azure identity services : Let the right people in, the wrong ones out
2. Security tools and features : Security is vital
3. Privacy, compliance and data protection standards : Best practice
4. Secure network : Essential, though most network traffic is now secured
5. Monitoring and reporting : To see what is and isn't working
GDPR was / is an EU regulation, with the UK now retaining the legislation and in many cases adopting a more rigorous approach. I don't think anyone wants to go near trying to unpick it.
Untangling data and geo-locating it for many international companies is well-nigh impossible, and what authority, in reality, would or could try to audit that, and...
I note that products such as SolarWinds and Zabbix monitoring instances can be built on Azure as third-party monitoring.
These can report on bandwidth, memory, CPU usage, hard disk capacity, load-balancing issues etc, which is vitally useful on high-load or peak-traffic, real-time systems.
The use by default of password complexity rules, coupled with the widespread adoption of password managers ( LastPass / KeePass / browser built-ins ), has reduced the compromise of accounts significantly. Limiting access to accounts with IP whitelist rules also reduces the compromise surface. That is not to say it still doesn't happen, just that it shouldn't in...
30 Nov 21 : got it working after some time
Install the CLI on Windows or Linux ( I chose the Ubuntu CLI )
I chose : Option 1: Install with one command
then, to run the program :
$ az login    ( sudo is not normally required for az login )
You will be asked to navigate to
The term Container in software and IT is a little nebulous, but a container is essentially a ( ideally single ) software process running on, depending on and utilising a larger operating-system superstructure.
Thus multiple containers can run on a single Windows/Linux machine ( eg see Docker ), much like individual shipping containers...
Top left of Azure control panel screen
Home>VMName>Networking>Add inbound port rule >
Source : Any
Source Port : * ( an HTTP request can come from a large range of ports )
Destination : Any ( We can send traffic to any public IP Address )
Service : HTTP : Port 80
Priority : default ( 330 )
Name : Port_HTTP
The ( currently ) single Azure data centre for the whole continent of Africa is located in South Africa. You would be right to think that some provision should exist off the west coast, catering to Ghana, Nigeria, Cameroon and Gabon, for example. Currently, no.
I assume there just isn't the demand, and that current West African country network infrastructure...
Samuel, you raise an excellent point: dedicated machines in your industry, and in healthcare, manufacturing etc, have highly functional, working, stable software, often running on legacy operating systems ( Windows / embedded Linux etc ) that can't be put 'in the cloud'.
The division of visual diagrams into 4 key categories
is very useful as an immediate guide to which type of diagram is optimal for displaying the data and the conclusions you wish to draw from it. Thank you.
The 50% promo doesn't work, or is far too generous. Promo days resulted in similar sales quantity ( coffees served / day ) as other days but half the revenue; they didn't ( significantly ) increase the number of coffees sold.
The shop doesn't need 8 staff, especially Mon-Wed.
There is little correlation between average temperature and hot/cold coffee sales.
The busiest days are Friday...
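The arithmetic behind the promo finding can be sketched in a few lines of Python. All the figures below are hypothetical, chosen only to illustrate why "same volume at half price" halves revenue:

```python
# Hypothetical daily figures: a 50% promo that doesn't lift volume
# simply halves the day's takings.
normal_price = 3.00
cups_normal_day = 200
cups_promo_day = 205   # roughly the same volume as a normal day

revenue_normal = cups_normal_day * normal_price        # 600.0
revenue_promo = cups_promo_day * normal_price * 0.5    # 307.5
print(revenue_normal, revenue_promo)

# For the promo to break even on revenue, volume would need to double:
breakeven_cups = revenue_normal / (normal_price * 0.5)  # 400 cups
print(breakeven_cups)
```

So unless the promo roughly doubles cups sold, it loses money on the day it runs.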
select customer.firstname, customer.lastname, sum(invoice.total) from customer
inner join invoice
on customer.customerid = invoice.customerid
where ( customer.customerid = 1
or customer.customerid = 6
or customer.customerid = 59 )
group by customer.customerid
order by -sum(invoice.total); /* The negative for descending order...
Just spent a really engaging few days ( about 5 ) getting to grips with SQLite: filters, code and the sheer power of it. I am very impressed, while realising I am just scratching the surface of what this tool can do.
I am fortunate that I also code ( C / Python ), so I can see parallels and different ways of achieving tasks, and do appreciate that SQL provides...
So, a personal learning experience. I spent about 4-5 days in late October writing some Python code to filter the flights.csv file, and was successful.
Now I have spent about 4 days learning SQLite and have two identical CSV files of filtered departures from LAX, for days 5 ( time >= 1700 ), 6, and 7 ( time < 1200 ).
I have learnt a lot about big files and the power of...
select firstname,lastname from customer where customerid = 42;
customerid is an INTEGER, so the match can be with an integer; no quotes required
select employeeid,firstname,lastname from employee where address like '%77%';
address is NVARCHAR(70), thus the LIKE search term ( '%77%' ) is in quotes
sqlite> select count(customerid) from customer where postalcode is...
A potential source of confusion is that the download from GitHub gives two files, a .sql and a .sqlite.
Chinook_Sqlite.sql is human-readable using a text editor ( Sublime Text etc ). The .sql file is, in effect, a .dump of the database file.
In SQLite you can read the sql or open the sqlite
sqlite> .read Chinook_Sqlite.sql
@AndrewB @BolajiAhmed :
To what extent does this actually happen in the 'real world'?
Almost never : data would be added directly to the SQL DB in this fashion only by DB-maintenance or analysis IT devs working manually.
The majority of manual data entry is done via web interfaces, which provide software checks for data errors, validity and form ( formats of...
sqlite> .mode csv
sqlite> .import flights.csv flights
sqlite> select count(year) from flights;
Cool : it took a couple of days working around SQLite to get to this point. I'm still getting the hang of adding a ; to the end of NON-dot commands and, importantly, NOT adding a semicolon to the end of dot commands, as you end up with...
Why would it make more sense to use a separate integer value for a primary key?
Format constraints on the PK give error-checking capability before the PK is even queried.
Data security : the PK is not based on P.I.I. ( eg name / DOB )..GDPR
Immutable : the key never has to change; other aspects of the record can be updated..name...address..phone....etc
Zero potential for...
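A small illustrative sketch of the "immutable" point, using Python's built-in sqlite3 module and a made-up `person` table (the table and column names are mine, not from the course):

```python
import sqlite3

# Illustrative only: a surrogate INTEGER PRIMARY KEY versus keying on PII.
con = sqlite3.connect(":memory:")
con.execute("""
    CREATE TABLE person (
        personid INTEGER PRIMARY KEY,   -- surrogate key: immutable, no PII
        name     TEXT,                  -- free to change (marriage, typo fix)
        phone    TEXT                   -- free to change
    )
""")
con.execute("INSERT INTO person (name, phone) VALUES ('Ada', '555-0100')")

# The record can be updated without ever touching the key:
con.execute("UPDATE person SET name = 'Ada L.' WHERE personid = 1")
row = con.execute("SELECT personid, name FROM person").fetchone()
print(row)  # (1, 'Ada L.')
```

Had the key been the name or DOB, every correction would ripple through each table that references it.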
There is a LOT of data out there to collect, with the 'data' and its complexity evolving all the time.
More complex models will be required to capture and link related data together ( I guess! ), to allow relationships between data points to be found, mapped, and information derived.
A scatter plot for this data is pure noise, with zero correlation between height and weight; zero even when boxed off into 5/10 cm intervals.
This can't be a data set for sports anything, and is unlikely to be Olympian women, who would typically come in around 55 kg for a 165 cm female athlete ( eg Laura Kenny ).
Even if this data was for a specialised sport that is...
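For anyone who wants to put a number on "pure noise", a Pearson correlation coefficient can be computed by hand in a few lines of Python. The height/weight pairs below are invented purely for illustration:

```python
import math

# Hypothetical height (cm) / weight (kg) pairs, deliberately scrambled
# so that there is almost no linear relationship:
heights = [160, 165, 170, 175, 180, 185]
weights = [72, 55, 90, 60, 83, 58]

def pearson(xs, ys):
    """Pearson r: covariance divided by the product of the spreads."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

r = pearson(heights, weights)
print(round(r, 2))  # close to 0: no linear relationship
```

An r near ±1 would mean a strong linear relationship; a value near 0, as here, is exactly the "noise" a featureless scatter plot shows.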
I can see that with numbers as easy to read as a 24-hour time written as a 2/3-digit number ( 730, not 0730 ), or a month in text ( January, February etc ), converting these into numbers which sort easily can be hard in Excel.
There is also calculation in base 7, 12, 24 or 60 ( 0930 - 0830 is not 100 but 60 minutes ), for example.
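A quick Python sketch of that clock-arithmetic trap: naive subtraction of HHMM values is wrong, so convert to minutes past midnight first:

```python
# Subtracting clock times as if they were ordinary numbers goes wrong:
print(930 - 830)  # 100, but the elapsed time is 60 minutes

def hhmm_to_minutes(t):
    """Convert an integer clock time like 930 (i.e. 09:30) to minutes past midnight."""
    return (t // 100) * 60 + (t % 100)

elapsed = hhmm_to_minutes(930) - hhmm_to_minutes(830)
print(elapsed)  # 60
```

The same convert-then-calculate idea applies to any mixed-base quantity (days of the week, months, degrees/minutes/seconds).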
I have 213 flights departing LAX across all days 1-7;
88 flights departing LAX on days 5/6/7;
47 flights from LAX between
day 5, departing after 1700 ( tail N8654B ), and
day 7, before 1200 midday ( tail N364AA ),
sorted by departure time.
Wow, that was a learning experience
=VALUE(RIGHT(B2,4)) for the year in numbers
=LEFT(B2, SEARCH(" ",B2)) for the month in text ( note this keeps a trailing space; SEARCH(" ",B2)-1 drops it )
=LEFT(B2,SEARCH(",",B2)-1) gave me eg April 23 from April 23, 1983
=VALUE(RIGHT(C2,(LEN(C2)- SEARCH(" ",C2) ))) gave me 23 from April 23
and 9 from July 9,
as you had to calculate the length of the string between the space and the end...
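The same extractions can be done with Python string slicing, which is a handy way to cross-check the Excel results (the sample date is the one quoted above; note Python indexes from 0 where Excel's SEARCH counts from 1):

```python
b2 = "April 23, 1983"

year = int(b2[-4:])                 # like =VALUE(RIGHT(B2,4))        -> 1983
month = b2[:b2.index(" ")]          # like =LEFT(B2,SEARCH(" ",B2)),
                                    # but without Excel's trailing space -> "April"
month_day = b2[:b2.index(",")]      # like =LEFT(B2,SEARCH(",",B2)-1) -> "April 23"

c2 = month_day                      # "April 23"
day = int(c2[c2.index(" ") + 1:])   # like the RIGHT/LEN/SEARCH combination -> 23

print(year, month, month_day, day)
```

Everything after the space is taken with one slice, so there is no need to compute the remaining length explicitly as Excel's RIGHT/LEN/SEARCH combination does.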
They made the same mistake in the last course, confusing population variance with sample variance.
Population variance = 240 / 9 = 26.67 ( we all agree )
Sum of ( mean - value ) ^ 2 = 240
n ( population ) = 9
The sample variance, which has been quoted as the answer,
= 240 / ( n - 1 ) = 240 / 8 = 30
I think we are looking for patterns here and outliers. Is there a destination, airline or time of day that is 'problematic' with extra delayed or cancelled flights?
An hour lost on this : is it me, or is there an error and confusion regarding the sample set and the quoted variance calculations?
Sum = 867
Mean = 57.8
Sum of squared deviations ( xi - 57.8 ) ^ 2 = 7686.4
n = 15
n-1 = 14
7686.4 / 15 = 512.43 ( Variance : Population )
7686.4 / 14 = 549.03 ( Variance : Sample...
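A quick way to keep the two formulas straight is Python's statistics module, which implements both. The data below is a hypothetical sample chosen only because its numbers are easy to check by hand, not the course's data set:

```python
import statistics

data = [2, 4, 4, 4, 5, 5, 7, 9]

mean = statistics.mean(data)               # 5.0
ss = sum((x - mean) ** 2 for x in data)    # 32.0: sum of squared deviations

pop_var = ss / len(data)         # divide by n     -> 4.0  (population variance)
samp_var = ss / (len(data) - 1)  # divide by n - 1 -> ~4.57 (sample variance)

# The statistics module makes exactly this distinction:
print(statistics.pvariance(data), statistics.variance(data))
```

The rule of thumb: divide by n when the data IS the whole population, and by n - 1 when it is a sample being used to estimate the population's variance.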
I think the measure of time in general is one of those special categories : time itself is continuous, yet we break it down into seconds, minutes, hours...decades etc. It is also determinate : 1700 will come an hour after 1600, as Wednesday comes after Tuesday.
We do represent time in numbers, in discrete blocks, and we can quantitatively measure a period of time,...
Companies like McDonald's, Pizza Express, Starbucks etc must have heaps of data on this type of thing, where the restaurants are as identical as they can be in terms of customer offering ( same menu, same ingredients etc ) but also differ a great deal in terms of location, staffing, size of premises; and from a very simple, top-down business point of view,...
Focus on : flights departing after 5pm on a Friday and before 12pm (midday) on Sunday, DEPARTING LAX
This will allow the flights.csv file to reduce in size significantly, from > 5 million rows to a few thousand.
I'm just not quite sure how to do that!
Update : 43,309 lines, I think, courtesy of a couple of days of Python coding.
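A sketch of how that Python filter might look with the standard csv module. The column names here (Origin, DayOfWeek, DepTime) are assumptions about the flights.csv header, not confirmed from the file; adjust them to the real one:

```python
import csv

def filter_flights(in_path, out_path):
    """Keep LAX departures: Friday (day 5) from 1700, all Saturday (day 6),
    Sunday (day 7) before 1200. Column names are assumed, not verified."""
    with open(in_path, newline="") as src, open(out_path, "w", newline="") as dst:
        reader = csv.DictReader(src)
        writer = csv.DictWriter(dst, fieldnames=reader.fieldnames)
        writer.writeheader()
        for row in reader:
            if row["Origin"] != "LAX" or not row["DepTime"]:
                continue  # wrong airport, or no departure time recorded
            day = int(row["DayOfWeek"])
            dep = int(row["DepTime"])
            if ((day == 5 and dep >= 1700)       # Friday after 5pm
                    or day == 6                  # all of Saturday
                    or (day == 7 and dep < 1200)):  # Sunday before midday
                writer.writerow(row)
```

Streaming row by row like this keeps memory flat, which matters on a 5.8-million-row file.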
DDDM, by definition, is based on analysing data that exists: it has been collected and structured, and thus has already aged to some degree ( an hour, a day, a month, a year etc ).
Decisions by their nature determine the future, and while historic insight is important ( how did we get here? ), it might not provide insight into 'where are we going?'.
There is a...
The file flights.csv is huge: 5,819,080 rows. Day 31, month 12 starts at line 5,805,948, representing ~13,000 rows of data per day!
You can use Sublime Text to view the file as raw text. LibreOffice Calc / Excel has a row limit of 1,048,576 rows.
Can the use of such a system justify the application of the right not to be subjected to automated individual decision-making?
The question is necessarily complicated by the 'NOT'
To rephrase ( if only for my own clarity ):
should a person have decisions that could affect their future made by a computer / algorithm, based on data they have uploaded to...
This area of law, morals and public debate is still under development.
A key aspect of the Mario Costeja González case was that the argument that the original newspaper article should be retrospectively redacted was thrown out.
It is still a matter of public record, in the newspaper, that Mario Costeja González was subject to court proceedings; it...
Data that is difficult or impossible to change, and that could, if put together, enable identity theft ( DOB, place of birth, current complete address, and other biometrics that cannot ever be changed by the data subject ( me/you ) ):
Place of birth
Extrapolated information about other family members
The data owner of the opinion would be the author of the opinion, not the subject of the opinion.
Other areas of law cover the subsequent ( inadvertent or requested ) release of this data ( libel / slander / discrimination ) etc.
The data ( employer opinions ) is private and confidential and cannot be released; companies or individuals could be sued if such...
All browsers offer the ability to:
1. delete cookies,
2. block cookies,
3. use extensions like uBlock to limit exposure,
4. more usefully, only allow cookies for the 'session'.
Why would you choose 4? Sites need to know that you have logged in and authenticated ( banking and financial sites ), but it is sensible for you to tell your browser to forget this...
Occasionally, its representatives tell visiting parents that personal data of their children, including names, home addresses and genders, is processed.
As best practice, the parents should be informed automatically, when initially submitting data, of the scope of all future data-processing purposes for their children.
In the event that...
Many of the larger organisations ( the Googles / FBs etc ) have an automated process to release your data, reducing the very real cost of processing.
The cost of processing and releasing data to subjects can of course be reduced by companies and other entities by ensuring that they hold very little data on data subjects, or even by purging PII data...
Prior to the internet, the concepts of data privacy, PII and identity theft were the stuff of obscure spy novels.
Identity theft, and the loss of financial reputation that comes with it, is now commonplace, costing the public millions a year.
Companies have until recently been extremely lax about storing customer PII, considering this database a free resource for them to use...