@ih answered your original question on calculating the statistic.
The following tries to answer your follow-up questions.
You pretty much understand what it does. "ChiSquare" and "Prob>ChiSq" report the result of comparing the estimate with zero. Here the probability is 0.0409. It is calculated as the probability that a Chi-Square random variable with 1 degree of freedom is larger than the test statistic 4.18. Loosely speaking, it says that if everything goes well (e.g., the model is correct) and you conclude that the intercept is significant, you could be wrong with a small probability, 0.0409 (a Type I error).
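If you want to reproduce that probability yourself, here is a minimal sketch in Python using scipy (the 4.18 and the 1 degree of freedom are the values from your output; scipy is just a convenient tool for the tail probability, not what your software uses internally):

```python
# Reproduce Prob>ChiSq from the ChiSquare statistic
# (values taken from the output discussed above).
from scipy.stats import chi2

chisq_stat = 4.18   # "ChiSquare" column
df = 1              # degrees of freedom for a single parameter

# Right-tail probability: P(ChiSquare with 1 df > 4.18)
p_value = chi2.sf(chisq_stat, df)
print(round(p_value, 4))  # ~0.0409
```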
Please check out the literature on hypothesis testing and on Type I and Type II errors. Reading about linear regression would be a good start.
In linear regression, we use the "t-Ratio". According to linear model theory, the "t-Ratio" statistic should follow a Student-t distribution. So if the "t-Ratio" statistic is too small (negative, far to the left) or too big (positive, far to the right), one would be less likely to be wrong in concluding that the parameter is significant.
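As a sketch of that two-tailed idea, here is how a p-value for a t-Ratio could be computed (the t value and the residual degrees of freedom below are hypothetical, just to illustrate that both tails count as "extreme"):

```python
# Two-sided p-value for a hypothetical t-Ratio from a linear regression table.
from scipy.stats import t

t_ratio = 2.10   # hypothetical t-Ratio
df_resid = 30    # hypothetical residual degrees of freedom

# P(|T| > |t_ratio|) = 2 * P(T > |t_ratio|): both tails are evidence against zero
p_value = 2 * t.sf(abs(t_ratio), df_resid)
print(round(p_value, 4))
```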
Here in logistic regression, we don't have a "t-Ratio" statistic. The theory gives us "ChiSquare" instead, and an extreme "ChiSquare" statistic (extreme to the right, since only the right tail matters) implies that one would be less likely to be wrong in concluding that the parameter is significant.
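If it helps to connect the two pictures: a Chi-Square variable with 1 degree of freedom is the square of a standard normal, so, assuming the "ChiSquare" in your output is a Wald-type statistic (roughly (estimate / std error)^2, which is common but is an assumption about your software), being far to the right on the chi-square scale is the same event as being far out in either tail on the z scale. A quick sketch:

```python
# A ChiSquare(1 df) variable is the square of a standard normal, so the
# right-tail chi-square probability equals the two-sided normal probability.
from math import sqrt
from scipy.stats import chi2, norm

chisq_stat = 4.18
z = sqrt(chisq_stat)                 # |z| corresponding to the chi-square value

p_chisq = chi2.sf(chisq_stat, 1)     # P(ChiSquare(1) > 4.18)
p_two_sided = 2 * norm.sf(z)         # P(|Z| > sqrt(4.18))
print(round(p_chisq, 4), round(p_two_sided, 4))  # both ~0.0409
```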