You can tune into this call live every Tuesday.
Visit acb.community to learn more about what is coming up.
The following transcript was auto-generated:
2022-12-13 – Unmute Presents
The Your Own Pay Podcast Network
Unmute Presents
[0:01] This is an ACB Community Call presented by American Council of the Blind.
[0:11] Music.
[0:18] This Unmute Presents Education Series was hosted on Tuesday, December 13th, 2022. All right, welcome everyone to our first education event,
photography.
So we’re gonna be doing that today. I just wanted to remind everyone here at the top, we’re gonna follow the same rules when it comes to questions, make sure everyone gets their first question answered.
And then if we get through everyone, we’ll take second questions.
If you have any comments or questions, or you need to get ahold of us, you can email us at unmutepresents at gmail.com.
Michael, how’s it going? What do you got for us today?
[1:06] Oh, see? That’s what happens when we double mute ourselves! Anyways, it’s going well, Marty. Thanks for asking.
So, as Marty said, we have something a little bit different this time. I believe we’ll have an opportunity for questions and answers to be asked. However, I also have some new announcements to make.
So firstly, let’s get the elephant in the room out of the way. We have been podcasting a lot more content now.
So we’ve almost tripled the amount of content that’s coming out in the podcast feed. Last Thursday, we had a podcast drop between Marty and myself; it was about
26 minutes long, talking about backing up your content and making sure that you are secure, as well as backup power solutions.
In addition, on Sunday I published a short-form piece of content talking about how to subscribe to the Unmute Presents podcast.
I know.
[2:12] If you don’t know how to subscribe to the Unmute Presents podcast, how are you going to hear the podcast that helps you subscribe to the Unmute Presents podcast? Anyways, I just wanted to throw that out there for people who are interested in Overcast.
My thought was you may be subscribed somewhere else, so maybe you’re interested in how to do it in Overcast. Lastly, very important: after some requests and interest from individuals,
I can confirm that I have a representative from Raz Mobility, the company that makes the MiniVision and SmartVision phones, coming to speak at Unmute on January 10th.
So we’ll remind people, especially those who are regular visitors to the Unmute Presents call: on January 10th at 10 a.m. Pacific time, our regular time, that’s 1 p.m. Eastern,
Raz Mobility will come and demonstrate and talk about the MiniVision line of products.
Sounds like they’ll be shipping me one to test out, so hopefully I’ll have some more feedback on that. So those are the announcements I have. Remember that this call will be podcasted
shortly after the end of it. And with that, we have a special person here, so I’m going to hand it off to Michael to introduce himself and talk about
photography from a visually impaired individual’s perspective. Hey Michael.
[3:35] Hi everyone. Thank you guys so much for having me. Thanks, Marty and Michael. It is, you know, an honor and a pleasure to be here with you guys once again. If you guys remember, I
was with you guys before when we discussed WayAround, the tagging and scanning solution
for your iPhone and Android, so I’m sure you can go back and listen to that. But I’m back again on a multi-part series.
And this is a very…
[4:05] close topic to me, because I feel like I’m a photography person. I love taking photos. While I was a teacher at the Criss Cole Rehabilitation Center, I did a photography class occasionally. So I figured it would be neat for us to do this in podcast form,
where we can do it over several sessions. It won’t be every week, but we will do this occasionally.
To talk about photography. And, you know, a lot of people think, okay, I have an iPhone, it’s got a camera on it, I can use it to read things or use it with sighted assistance to
[4:50] see something, things like that. And most people think, I don’t want to take pictures because I can’t see them. Well, the truth of the matter is that there are a lot of ways that you can enjoy photography, or use it to share with your family.
And today we’re going to keep it simple.
We’ll go into more of the apps in a later session.
We’ll talk about a few things today, but the biggest thing that I want to talk about is the fundamentals, and I may open up the Camera app with VoiceOver a little bit just to show that off.
But the biggest thing, the most important thing, is to know that you can use the camera on your phone
[5:41] to take good pictures. And it doesn’t matter if you have a little bit of vision, a lot of vision, or no vision; it really makes no difference.
We can all take great pictures with our phones. And that’s why we’re here today.
And so one of the things that I want to talk about first is, you know,
if you’re on an iPhone, and this is kind of geared towards the iPhone, and maybe eventually we’ll come back and talk about Android.
On the iPhone, we have our cameras. And most people, and I’ve worked with so many people who are blind or low vision,
they think that the camera on the back of their phone is right at the center of the back of their phone. And so a lot of people have difficulties in knowing where to hold their phone. You know,
every phone has slightly different camera placement. Like the Samsung phones:
I call it the camera stoplight on the S21, because it has three camera lenses, one above the other above the other. And the iPhone has cameras too; there are three of them on the Pro models, two on the others.
[7:04] But the nice thing about these phones is they’re all in one place. They’re all centered around one area on the phone. Now on the Pixel lineup, they’re in different places. Lining up your cameras is pivotal.
It’s key to taking a good picture.
The way that our eyes work is that we see straight ahead out of our eyes, and then we have an amount that we can see on the sides. Your camera works the same way.
So it’s kind of like you have your central vision and your peripheral vision.
Right? So as light comes into the camera, it sees things just very similar to how our eyes work. That’s how we take videos and that’s how, you know, all these things work.
[7:54] So when we take a picture, we’re seeing what’s behind our phone, as if the cameras were eyes. The one thing that happens is some people will hold their phone as if they’re looking at the
screen, or, you know, in all different ways. And the biggest thing that I like to say is, when you’re trying to take a good picture,
[8:24] you need to hold your phone parallel to what you’re trying to capture.
So if we’re out doing O&M and we’re out getting trained on, you know, we’re crossing a street, you have parallel traffic that is in front of you. You hear the cars in front of your face.
You want to hold your phone at that kind of angle where it’s up and down, where the Lightning port is pointed towards the ground, the volume buttons are pointed
to the left, and your Siri button and things like that are pointed to the right.
And that way, if you have the cameras pointed away from you, then you’re going to take a better shot.
Now, if, you know, you always change that based on how you’re going to be, you know, facing like if you have something on a desk, then you want to have the phone parallel to the desk.
And then have it a few feet away. You know, a lot of times people think, I need to put the camera right up against the thing that I’m trying to take a picture of.
And that doesn’t work either.
You know, a lot of folks, including myself, are very used to the iPhone’s camera app.
You know, I happen to have a face right here, and I’m going to use it.
[9:52] So I’m going to turn on VoiceOver.
[9:56] VoiceOver said: Boston, 74 degrees Fahrenheit, mostly cloudy. I’m on my home screen.
One of the nice things about VoiceOver is it will give you information about your environment when you’re using the camera.
I’m going to open the camera app.
Camera, double-tap to open.
[10:19] Camera, take picture button. It automatically focuses me on the take picture button. But what I want to do is flick left until I hear the viewfinder.
Camera mode. Zoom. One point. Camera control status.
Photographic styles. Camera controls. Viewfinder. Focus unlocked. Image. Okay. So it’s… Double tap to focus. Computer keyboard.
So it’s already telling me what it’s seeing, which is kind of neat; it’s describing things. So I’m going to look to my left.
[10:54] And VoiceOver’s not doing it. Viewfinder, focus unlocked. Image, double tap to focus.
Adult, printer, sofa, wood processed.
Cinematic, video, photo, portrait, pano. Adult, sofa, wood processed. So it’s interesting that it’s not doing the one face that it used to do.
Mine’s doing it actually. So Taylor’s is doing one face.
[11:22] So that’s really interesting. The nice thing about the iPhone is it will tell you how many faces are in the picture.
And mine was just telling me that there’s an adult, and what was in the picture.
And I was using the technique that I just explained.
Now, one thing that helps sometimes is to slowly move your phone until you get the shot that you want.
If you move your phone quickly, you won’t get all the details; it takes a second to focus.
But then you would wanna find that take picture button and then you could double tap on that to take a picture.
Now, one of the big questions people have is, okay, I took a picture.
And we’ll talk about how to work with pictures in another lesson. But one of the things that I want to talk about is the concept of a live photo.
And what that is, is once you take a picture…
[12:33] it actually captures a second and a half before and a second and a half after the picture is taken.
And so you could double tap and hold on the picture in your photo library.
[12:46] And it will play a small little clip of what was happening when you took that picture.
So, you know, some people have even gotten good enough that they can label a picture by talking right before, right after, or during the taking of the picture.
Very difficult because you only have three seconds. But you can do it.
It’s kind of neat. So for example, one of the things that I enjoy is I love rivers and water and I love taking photos of that, right?
But, you know, it doesn’t matter much if you can’t see the picture. So one of the things that I enjoy doing is, if I’m taking a picture
of a river and there’s a ship going by, you may get a picture where you can hear the motor of the ship, or the horn, or different things that identify what the picture is. And it’s really kind of neat. You could have people talking in the background, you could have waterfalls,
or you could have different things. It’s like, oh, you know, that audible cue reminds me that I was at this location when I took this.
And you could then say, oh, hey, check out this neat picture I took at this aquarium or this place, just based on the audio that was in the live photo.
So that’s why Live Photos are great. The Camera app really is an amazing app for what it does.
And it’s only one app on the phone.
[14:30] But, going back to the concepts: we talked about how to position the camera and how to hold your camera.
But one of the things that is really important is holding your phone still.
And the reason is that the more you hold your phone still, the better the image will be.
Unlike a lot of cameras, like DSLRs or big bulky cameras, your iPhone has a very big brain.
And it can go in and say, oh, well, there’s blurriness in this part of the image.
There’s dimness here. There’s all these things. And so it can actually enhance during the taking of the picture to make your picture look great.
[15:23] Now, it also has settings where you can turn all of that off and do what’s called shooting raw.
And that means that the picture that you take is just the data, like the light that comes into the camera.
So the chip inside the camera is called a sensor. The lens and the sensor work together to take in light and translate it into something that our phones can understand.
And we use a sensor in our cameras to do that. So one of the questions you may ask is, why do our phones need to have three cameras?
Or I like to call the Samsung Galaxy S22 Ultra the incomplete braille cell.
And that’s because it has five camera lenses.
[16:21] And it actually has dots one, two, and three, and then dots four and five of the braille cell.
So there are five lenses on that thing. And I think the flash is in position six. So, I mean, you could argue that that’s a full braille cell.
So, you know, it just gets crazy when you look at all of those cameras.
[16:52] And each one has a different purpose. The iPhone Pro models have a wide angle lens, they have a regular and a telephoto lens.
And you don’t need to know exactly what all of those mean, but, like, the wide angle lets you take pictures of a wider shot.
[17:15] And each of them has different clarity values and all of that, you know, you could look up online, but different lenses are better for certain things than others.
But what it allows you to do is now they have software built in where you can take a picture with one lens and you say, oh, I didn’t get exactly what I wanted in that one shot.
So you could zoom out after taking the photo and say, ah, that has it in there.
So I’m gonna use that version of the picture. So there’s a lot of image editing things you can do, you know, with an image that having one camera doesn’t let you do.
The other thing that it lets you do is zoom in further.
So if you’re using OCR, it can do those things.
[18:06] That’s why we have all these cameras on our devices: so that we can take advantage of different lenses, different light levels, and things like that. For example, and these things change all the time, certain lenses do not support Night mode. What Night mode lets you do is, if it’s really dark outside, you can take a picture and it will
make it look really good at night.
And the interesting thing about that is the Google Pixel does it, the iPhone does it. I’m not sure whether Samsung has that built in or not.
But it just makes shots look great in low light.
Each one does it a little differently, but they all make pictures look great instead of blurry. Because that’s always been the challenge: if it’s really dark outside, it’s hard to make a shot look great.
[19:04] So, you know, those are some basic camera fundamentals. There are a lot of terms, like aperture size,
which means, you know, the size of the lens opening, how much light can come in.
We have, you know, different stats there. But the biggest thing to really think about with cameras is, what is going to be the best camera that you can use at the time?
I own a Canon Rebel T7i, which is a really nice, well, it’s not really nice, but it’s a good digital camera that has detachable lenses. And that’s great. And I love that camera.
It has a touchscreen and all those things. Completely inaccessible, of course.
[20:00] But you can do some great shots with it. The problem with that camera is it’s a huge camera that needs a camera bag and you have to lug it around if you want to take it somewhere.
With your iPhone, you have an accessible, affordable camera that you don’t have to buy anything else for.
As for the iPhone camera, Apple does these Shot on iPhone ads and things like that, where they show all these pictures and videos.
[20:35] So there’s so many ways that you can use your iPhone with that.
The other thing I wanna talk about is the difference between portrait and landscape.
And if you’re not aware, portrait is where you’re holding your phone with the charging port,
like we talked about, pointing down to the ground, and volume to the left and the side button to the right.
Now, if you…
[21:05] If you’re holding it like that, then it’s in portrait. Think of a printed piece of paper, eight and a half by eleven; that is a portrait page if you’re holding it as we would normally print it off the printer.
If you turn that page on its side, to landscape, then you’re holding it at a different viewing angle. And we do that with our phones.
We can turn our phones to landscape to watch a movie or take a video or any number of things there.
So you could take a picture or a video in portrait or landscape and a picture looks great in portrait.
[21:53] But you can get more to the left and right in landscape. So it just depends on, you know, if you have a family photo, you may want to use landscape. If you’re taking a selfie, you know, you may want to use portrait.
[22:10] So these are all of the different things you can use, you know, terms and ways of doing things and ways of holding your phone,
to make your photos look great.
[22:28] So, I guess at this point, I would like to open it up for any questions because, you know, I could go on about different topics, but I’d like to know what people are interested in learning about photography.
So before we get to those questions, Michael, thank you so much for coming and explaining a lot of these.
I have a comment that I want to make and a question for you. The comment, and I think this is kind of interesting, is that while
he was talking and demonstrating the camera on iOS, I had my Pixel 6 Pro here that I was playing with, because I wondered if I could get some of that
feedback from the Pixel 6 Pro, and you can. One of the cool things is it’ll tell you, one face, 30% of screen, great for selfie. That’s the feedback from the Pixel when you’re using your front-facing or back-facing camera. Again, this is on the Pixel 6 Pro with Android version 13.
But on the other hand, it does not give you
[23:29] environmental feedback such as the iPhone does, where it tells you that you’re pointing it at a computer or whatnot. So, you know, give and take with both of these devices. There are tools on Android to be able to make that happen.
My question for you though, Michael, is this: when
I go and look at cameras and phones, and it says this camera has a 36-megapixel camera and this camera has a 120-megapixel camera,
does a bigger megapixel number always mean better?
[24:04] Great question. And the answer is no, bigger does not mean better. With megapixels, most iPhones have always done pictures at 12 megapixels.
And even with the iPhone 14 Pro lineup, they increased the sensor to 48 megapixels, I believe.
And what happens is when you have it set to 48.
[24:31] It’s going to take a picture, use its artificial intelligence, take all of that great information, and scale it back down to 12 megapixels, because those 48-megapixel images are huge.
They take up a lot of data, and they’re big.
Now, what you can do, and we talked about this earlier, is set your settings to RAW, and you can get that full 48-megapixel image, but it won’t be edited; no AI work will be done to it.
You could do all of that post-production on your own, but that just means there’s more work you have to do.
But you can get that full 48-megapixel image.
The sensor is slightly bigger on the 48-megapixel compared to the others. So you’ll notice the camera bumps are bigger on the 14 Pro line.
[25:28] And so you do get more information than with 12 megapixels, but you may not get as good an image in that space. So there’s aperture size, light level,
[25:45] all of those things are keys to making a good image, not just your megapixel count. Kind of like on your computer when you say, well, I’ve got 16 gigs of DDR2,
[25:59] and I’ve got 8 gigs of DDR4; which one’s better? Well, there are a lot of factors that go into that. The speed of the memory can balance out the amount of memory.
So it just depends. And comparing unified memory, like on the M1 chips, to non-unified memory, they all have their trade-offs.
So again, it’s kind of like music editing. It depends what you’re looking for.
Certain lenses are better for night shots or certain images. Like, I think the new iPhones’ lenses can even take macro shots.
And macro means, you’d think it’d mean big, but what it means is, if I take a macro picture, I’m taking a picture of a butterfly or a caterpillar,
or a flower, and I’m making it huge so that I could see it in detail, even though it’s a tiny thing.
We’re making something small, bigger, so we’re macroing it. So there’s, you know, all of those things go into account when you’re looking at lenses.
And that’s why it’s better to have more lenses on a phone than to not.
And I love the camera on the iPhone SE. It’s a great camera, but it’s only one.
[27:20] There’s only one lens. So you only get the capabilities that you have on that one lens.
But even on the iPhone 13, for example: I went bowling, and I took these phones with me.
[27:32] I took all three phones, because I’m that kind of person: the SE, the 13, and the 14 Pro.
And I zoomed in on the scoreboard at the top of the bowling alley to see how well I could read it. I could barely read it with the SE. I could read it a little bit better with the
14, I mean, the 13, and then I could read it extremely well with the 14 Pro.
[27:59] The last thing I’ll say on all of that is camera placement is key. Knowing where the line of sight of your cameras is, is important, because one of the things that really gets me on some of the
Android phones, like the Pixel, is that you need to know where you’re going to aim your camera, right?
So when you have the camera bar, I call the Pixel’s camera area the camera bar because it goes from one side of the phone to the other, and they spread those lenses out, you don’t really know where one
lens is pointing compared to the other lenses. So it could be the left part of the phone, the middle part, or the right part of the phone. Whereas on the iPhones or the Samsungs,
it’s all in that one upper right area. So those are things to look at too when you’re.
[28:51] Buying a phone: where’s the camera? Kind of like on the iPads. On most of them, when you’re using the camera in landscape, it’s off to the left, and on the new iPad
it’s in the center in landscape. There are just so many
variations in camera placement, it’s good to just Google, I have an iPad Pro, where’s my camera? That sort of thing. So Michael, I wanted to jump in; I had a question for you as well. A lot of people are
really interested in being able to capture text with their phone or their iPad. So maybe you can talk a little bit about the best way to capture text, what is OCR? And once you have the text
captured, where you would put it, whether that’s a stock app or any third party apps.
[29:45] So OCR stands for Optical Character Recognition. And the thing about OCR is it’s now built into the Camera app, and you can even use the Notes app. So I would start with the Notes app, and,
you know, we could demo this on another podcast. But the neat thing about the Notes app is there is a way to scan a document.
And it will take the text that is in the picture and put it in the Notes app.
And what optical character recognition, OCR,
does is take an image and use artificial intelligence to figure out if there’s text in that image.
So of course, lighting and clarity and all of those aspects that make a good photo make a good text image for OCR.
This means that apps like Seeing AI, which is one of my favorites, and Envision AI, SuperSense AI,
all of those use AI to look at the text and figure out what is written there.
So when you scan a document, it’s basically taking a picture.
[31:01] This is how OpenBook and others do it too, guys. They use a camera to take a picture or a scanner to scan the image, and then it looks at it and comes up with the text.
So you can read text with the camera app, but I highly recommend something like the Notes app or Seeing AI.
The nice thing about Seeing AI is it will try to format the text with headings and links,
well, maybe not links, but colored text and bullets and lists and things like that, so that you can easily navigate it with VoiceOver,
which is really nice for scanning text.
The best rule of thumb for scanning a document is, as we talked about, having your camera parallel with whatever you’re trying to capture. You want to do that here as well.
You want to have it about a foot to a foot and a half away from the text you’re trying to capture.
The nice thing about Seeing AI’s document mode, and several others, is that these apps will actually tell you if you have a page in view. So I highly recommend those.
The Notes app can do certain things; it’ll tell you text detected, things like that. But check out Seeing AI, and we’ll go into OCR a little more in other lessons and talks.
[32:27] All right. I think we have a question in Clubhouse. Chanel, do we want to see who’s got a question in Clubhouse? Well, we can ask Andrea for that. I believe there are currently no hands in Zoom.
But Andrea, who do we have in Clubhouse?
[32:50] Let’s see. Nancy is now your… Oh, okay, Nancy. Yeah.
Then we may not have any questions. No, we do not have any questions in Clubhouse at the moment. Okay. And I’m just double-checking Zoom.
If anybody has any questions in Zoom, you can go ahead and raise your hands and we can take some questions.
Oh, all right. E.G., I think…
[33:19] Hi, E.G. Yeah, go ahead, Nanmoo. Hello. How are you doing? I’m doing fine, Marty and Michael.
Real quick question. The speaker mentioned three different programs, Seeing AI and Envision AI, and I couldn’t quite catch the third one.
The other one was SuperSense AI, I believe. SuperSense AI? That’s correct. All right, I’ll check that out as well.
[33:44] Thank you. All right. And next up we have Nate and I do believe we have someone unmuted in Clubhouse. If you could mute until your turn, that’d be great. But next up is Nate.
Hey Nate, how you doing? I’m trying to get unmuted. Okay, quick question. Where would you recommend going to find step-by-step directions for taking photos with an iPhone?
Michael and Marty, you guys may know this, but there’s a book, right?
Is it through APH? It might be under the Take Control series. They put out a lot of books by the…
What were you going to say, Michael? There is an accessibility book that’s not through Take Control. If you stick around with us, Nate, for about 30 more seconds, I’ll find that book for you. I know exactly what you’re talking about there, Michael.
All right. Let’s go to Clubhouse. Oh, I’m sorry.
[34:40] We do have a hand in Clubhouse now. Herbie. Hey Herbie.
Right. Hello. I want to go back to something you were talking about, and I think I may have lost something. So you were saying that we do not benefit from the higher megapixel cameras with the iPhone. Is that what you were saying, or am I misunderstanding?
So we do benefit from the higher megapixels. A higher megapixel count does not always mean a better picture; that’s what I was getting at. But what happens is, if you have it just set to
the defaults, it still takes a 12-megapixel picture. What it does is it takes that 48 megapixels that’s on the 14 Pro line and scales it down, using the extra information
that’s given in the 48 megapixels. And the AI is like, oh, I can make this better by using this data, I can make this better by using this data, and it makes a better picture
at 12 megapixels, partly to save space, but partly because a 48-megapixel image, if you were to keep the full resolution, like for printing or editing, would be huge.
[35:57] So we do benefit; it’s just that it typically scales down to 12 megapixels.
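For readers curious about the arithmetic behind that scaling, here is a rough sketch in Python. The 2x2 pixel binning and the three-bytes-per-pixel figure are illustrative simplifications, not Apple’s actual processing pipeline:

```python
# Rough illustration of why cameras scale 48 MP captures down to 12 MP.
# Figures below are approximations for illustration only.

def uncompressed_size_mb(megapixels, bytes_per_pixel=3):
    """Approximate uncompressed size, assuming 3 bytes (RGB) per pixel."""
    return megapixels * 1_000_000 * bytes_per_pixel / 1_000_000

def bin_2x2(pixels):
    """2x2 pixel binning: average each 2x2 block into one pixel,
    quartering the pixel count (e.g. 48 MP -> 12 MP)."""
    h, w = len(pixels), len(pixels[0])
    return [
        [
            (pixels[y][x] + pixels[y][x + 1]
             + pixels[y + 1][x] + pixels[y + 1][x + 1]) // 4
            for x in range(0, w, 2)
        ]
        for y in range(0, h, 2)
    ]

# A 48 MP frame holds four times the data of a 12 MP one.
print(uncompressed_size_mb(48))  # 144.0 (MB)
print(uncompressed_size_mb(12))  # 36.0 (MB)

# A tiny 4x4 "sensor" binned down to 2x2.
frame = [[10, 20, 30, 40],
         [10, 20, 30, 40],
         [50, 60, 70, 80],
         [50, 60, 70, 80]]
print(bin_2x2(frame))  # [[15, 35], [55, 75]]
```

Averaging four sensor pixels into one is one common way cameras trade resolution for lower noise and smaller files, which matches the 48-to-12-megapixel behavior described above.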
All right, so one other question I have, because this is always an issue, and I don’t know if you have an answer for this or not: when it comes to OCR apps like, you know, Seeing AI and all that, do they benefit from the better camera? Because one of the things I’ve always thought about is, do I want a 14 Pro because it’s going to have a better camera than even the 13, and maybe see things better? Would you say that is the case, or do you have an answer for that?
You know, it depends on, you know, the way I look at it is any better main camera, like the 48 megapixel that’s built in the 14 Pro, will definitely make those apps work better,
because they can take advantage of taking pictures in that.
You know, even if you don’t use the app, you could always take a picture in another app and import the picture into the app.
Because some of these apps do not have zoom capability. So being able to zoom in and out can really affect how well it scans the image.
[37:14] But like if the app itself does not take advantage of the camera system, you could always use another app.
You know, you could even have the Camera app read the text out loud, and you say, oh, that’s getting a really good shot of that.
Take the picture and then import it into Seeing AI, and that will probably give you the best OCR.
[37:37] All right, perfect. Thanks. Who do we have next? Okay, we have two in Zoom, and then I’ll double-check in Clubhouse. First up is Ebrahim. Oh, wait, hold on. Michael, did you have
something you wanted to jump in with? Yeah, I just wanted to follow up on Nate’s question before we go to you, Ebrahim. So there was a book produced, and it’s available at National Braille Press, by
Judy Dixon. The name of the book is Capturing and Sharing the World, Taking Photos and Videos with an iPhone. So that may be helpful for you as well. Again, available at National Braille Press.
Thanks, Michael. All right, Chanel.
[38:13] Yes, Ebrahim, go ahead and ask your question, please. I wanted to know, when positioning your camera to an object, say you want to take a picture of an object, is your phone centered on that object, or are you centering your camera on the object?
Great question. The camera is on the top right of the phone.
You want to center the camera. A lot of people will just try to center their iPhone with the object, but you want to think of it as the top
right. And that’s why I always say, the later Google Pixel phones have that whole bar,
so it’s hard to know where to line up. But with the iPhones, you want to make a straight line, mentally, or if you need to, do it with your hand:
go from your camera to the object that you want.
[39:15] Make it a straight line. And then, you know, usually if you’re taking a picture of an object, you know, about like the same for OCR about a foot away, we’ll get a good picture, a good shot.
I always get confused, because I think it’s SuperSense that tells you to put your phone in the middle of the page and then slowly lift it up.
So I always put my phone in the middle of the page, instead of putting the camera in the middle of the page.
Well, the reason why SuperSense tells you to do that is once you get to a certain distance away from the page, the whole page will be in the camera viewfinder anyway.
So that will work for a page, but if you’re looking at other objects or people, that’s where it might differ.
Even with a page, if it’s an abnormal size of a page or different things, just use that other method as well.
[40:15] One thing I’ll jump in here really quick. If you have a hard time keeping your phone steady while you’re trying to scan text from a document, say on a piece of paper or something like that,
they make a plethora of different kinds of stands. A lot of them have a way to have your phone,
so the camera’s facing down and the screen’s facing up. Then you don’t have to hold the phone.
You can then have the phone facing down at the table, and then you can take the paper and move it around in front of the camera until it lines up and gives you the perfect spot to actually grab and scan the text.
They make stands for iPads and iPhones in all different kinds of price points and stuff like that. It’s something to think about having if you have a hard time keeping your phone steady. That would help a lot.
Another thing is really… Oh, go ahead.
Sorry, I was just going to say that what also helps me keep my phone steady is instead of tapping the take a picture button, I use the volume up.
[41:22] Yes, exactly. That’s what I was about to say. Is that if you’re taking a picture and you want to hold your hand steady, double tapping is going to move your phone a lot.
And so pressing that volume up or volume down keys will take a picture pretty quick.
Another thing, and this is a little bit geeky, a little bit on the nerd side, but there is a…
[41:52] If you go into the Shortcuts app, if you go to the gallery, you can do a search for the shortcut called Say Cheese.
[42:03] And basically what this will do is you add it to your phone and you can tell S-Lady or S-I-R-I, Say Cheese.
And it will automatically take a photo from anywhere on your phone; you don't even have to be in the Camera app.
[42:24] That is kind of cool. One thing I will confirm for people who are curious about it: the volume keys do work with the Pixel now as well. They did not for a while. So now, with at least the Pixel 6, it will work for you to just tap one of the volume keys to take the picture as well.
[42:44] Fantastic. All right, Chanel, do we have another hand? Yes, we do. We have several. But first we're going to go to Jewel, and then I'll check Clubhouse. So Jewel, go ahead and unmute, please.
Hi. I actually have several questions. I’m doing all right, Michael. Thank you. That’s Marty. Sorry.
Horrible voices. I actually have several questions, and they're not necessarily about taking pictures, but about the Photos app and the OCR apps. So the first question I have is about
labeling photos in the Photos app.
Is it better to do a three finger double tap to label or is it better to do a caption through the info button?
And is there a use case for one over the other?
[43:30] I’ll give my thoughts, but I’d love to hear what Michael and Marty have to say on this. I like the caption because if you change your phone, you move it, you change phones and things
like that, the three-finger double tap is a voiceover label. And so if something changes about that image, right, that can change. But if you caption it, then that is a little different.
So Marty and Michael, what are your thoughts?
I think actually labeling the picture itself is always going to be the best way, because then the label will follow the picture around no matter where you have the picture. If you do it the other way, it's only on your device. Once the photo leaves the device, if you didn't actually label the photo itself, the labeling would be gone for that photo, and you'd have to do it wherever it landed next.
So to clarify, you mean the captioning label, correct?
Correct. Okay, just clarifying.
[44:30] And then, I don't label my pictures because I'm a horrible person and I don't really care. I mean, I do, but mostly, I don't take enough pictures that I would be lost going through my pictures.
I know what pictures I took on what dates usually. And when I don’t remember, honestly, I don’t need that picture anymore.
So I personally do not label pictures myself. I just go off of the metadata that’s provided to me from voiceover with information inside that image or the date and time of the image.
[45:02] Yeah, I will say I have 500 pictures from the last year including 200 from a recent photo shoot.
So labeling is important.
Yeah, yeah. So there’s nuances. So I did have a couple other questions but we can go to other people and come back if you’d rather.
Yeah, we have a few other hands up. So let’s get to those and then we’ll come back around if we have time, Jewel.
Okay. Do we have anyone in Clubhouse?
[45:28] Yes, Cindy Hollis.
[45:31] Hi. Hey, Cindy. Hey, I love taking pictures. I'm totally blind. I do it all the time, and I am told they're really good.
I wanted to offer a suggestion for taking photos. If you have access to Aira, which most of us do, at least for free, it might be limited for those that aren't paying for the service, to once every 24 hours, I think it is,
or once every 48 hours. I don't know if it's changed. But anyway.
[46:01] Having them help you take the picture will give you some ideas on where to move your hand, how far away to hold it distance-wise, and how to tilt it.
[46:13] I don't know if we lost her. Yeah, we did.
[46:19] Anyway, tilting it, distance, all of that stuff.
[46:26] So I just think that that’s another resource.
[46:30] It definitely is. Thank you, Cindy, for sharing that. I meant to mention it, because Aira is one of those tools to just keep in your toolbox. You can go take your own pictures, but having sighted interpretation of those images while you're in the motion of taking the pictures,
I think, helps me a lot with being able to orient myself.
And also with lighting. We don’t always think about lighting in our house, in the room, and where the light should be coming from.
Whether it's behind us or in front of us, those kinds of things. Once you start learning that from them, you'll pick up on it and you'll go, oh, well, it's not giving me the information I want.
Oh, maybe I should turn the light on. So, all right, thank you.
And Cindy, I wanna mention that that is really good advice.
[47:23] You know, that having Aira help is a really good way to get a good picture and they are really good at taking good photos.
I don’t know how much control over all of the cameras on your phone they have. So that is one thing to keep in mind as well.
So if anybody knows, if they can use like all three cameras, that’s, you know, that’d be really good information to know.
There are some third-party apps that allow you to use all three cameras. I believe one of them is called Filmic, and you can actually use each one of the cameras in a different way, separately and individually. It costs a little money,
but it really enhances what you can do with all three cameras.
[48:13] But some apps, like certain OCR apps, just use the one camera. You can't zoom in or out using the other two cameras with those apps.
And one thing that's neat about Seeing AI and other apps that use the LiDAR is you can actually get a good determination of how far away from an object you are,
using that LiDAR.
So if you know you need to be a foot away, you can use that to determine it.
[48:49] All right. Chanel. Yes. Okay. Next up we have Nora. Nora, you may unmute. Hey, Nora.
How you doing? Doing great. Hi.
Yeah. My question is, I have a new iPad on 16.1.1, and it has a camera, and I'm wondering, if I want to take a picture of a document, like say a notice from the management where I live,
or any object without any writing, how far away can I be to take a picture with the camera?
[49:32] I would say about a foot. Oh, okay, I just wondered. Thank you very much. Yeah, no problem. A foot to a foot and a half. Okay, thank you.
Yep. All right, Chanel.
Okay, next up we have Anthony.
[49:50] Hello, I’m also a huge photo person, so I wanted to drop two tips into the conversation. The first one for OCR, I bought a plain black placemat. And if I center the document on the
plain black placemat and use my elbow to my fingertips as the distance, I usually can get it very, very quickly and a very, very clear scan of that. The other thing when I’m helping folks
who are trying to take pictures for the first time with the iPhone, I tell them to make a cradle with their left hand using the pointer finger and the thumb, basically making that L, and holding the
phone with the right hand, centering it, and holding it to their chin, and figuring out where the camera will sit in the space between your eyes, above your nose, and moving
it forward from there. And then, as you turn your head, take your phone, holding it, and take the picture the same way. I've done it with Aira with folks, but I know you
could do it with Seeing AI and certain others. Use the object recognition in an OCR app, or
Aira, and then try to focus on things before you start taking pictures and get that feedback.
[51:11] And that’ll kind of teach you where you need to hold the camera to get certain things. Like, you know where things in your house are. I have a wine rack and I have a figure on top of the wine rack.
So whenever I have somebody here that needs that, we’ll focus on that first. We’ll get a clearer image of that and then move on to other things. But once you kind of get the orientation of where you’re holding your phone, you’ll be able to take really good pictures.
Right. Anthony, are you using a stock app to capture, or what are you using for your app once you have the text captured?
[51:44] I use Microsoft Seeing AI mostly, but I also have the Envision glasses, so I also have the Envision app on the phone and once in a while I use that too.
Cool, thank you. And by the way, especially for checks, everyone's like, oh, I can't do mobile banking, I can't do mobile banking.
Put the check on the black placemat, right in the center, hold the phone up, you know, an arm's length up, and you'll get it. Then flip it over and do the other side. And once you do one or two checks, you'll become an expert.
[52:15] Oh, can I add something real quick? With checks, what I typically do is I will put the phone on top of the check and then lift it up. I use Chase for my banking, and it can
auto-take the picture, and it'll tell you.
Yeah.
So I just put the phone down on top of the check, lift the phone up over the check, and then it will automatically take the picture.
You know, another thing that I do as a tip: I'm low vision, so I will use my phone to take pictures, but what I do is I put the camera basically
where my right eye is, my good eye, and that's the same side as the camera on the iPhone.
So I put the camera about where my eye is, and then maybe lift it up a little bit to see the screen.
But if you make it to where you’re looking through the camera, like where your eye is, like the phone’s in front of your eye,
then you’re going to get a pretty good shot if you’re looking in the direction where the object is.
So that’s another way.
[53:26] Make it to where your phone moves with your head if you move your phone around to find something. Yes.
All right, we’re running a little bit close on time. So let’s try and get these last couple of questions answered real quick.
Yes. Did we have anyone else in Clubhouse? Not right now.
Okay. So let's go ahead and go to Jane Suh. Can somebody explain how it works now where supposedly you
can use your iPhone camera as your, what is it? Webcam thingy?
Continuity Camera. Yeah. Whatever that is.
[54:17] Yeah. So the way that works, and it's really neat, is that you can use a stand, Belkin makes one, and you attach your phone to the top of your Mac.
[54:29] This is a Mac-only feature. There are other apps that make this work with Windows; I don't know what they are off the top of my head. But built into the Mac, you can clip your phone to the top of your computer, and
your phone will just show up as a device that you can pick from in Zoom,
supposedly, or any of these apps. It works. I've been using it for some live TV content that I'm producing with a couple of companies. And you don't even have to clip it to your computer.
I actually have a tripod sitting behind my MacBook, so it goes over the screen and over my boom arm.
So it doesn't, well, actually, my boom arm goes over it, so it doesn't see my boom arm, but it sees my face in focus.
And I use the Camera app to make sure that it says one face centered before I go in. If you're a Windows user, you can use the Can You See Me app at canyouseme.app.
But then in Zoom, I go to the right of the video enable button. There's a video dropdown option; you choose that, and then you choose iPhone camera,
and then your iPhone will make a sound to let you know that it has switched over, and then your camera is working just fine. I know we have another hand, but real quick, if you are on Windows and you need to do this, my recommendation is a tool called Camo.
[55:48] Does it use the back-facing camera?
You can use whatever camera, front or back, I think. By default, it'll use the back-facing one.
And one other thing, if you’re on a Mac computer, there’s also a setting where you could have one of the cameras facing you and the second camera facing down at the desk at the same time.
If you wanna talk and also demo something, it’ll show your desk and you can show that you’re demoing something either on your phone or something else, putting something together, unboxing something, any of that.
So it allows you to have one camera facing you at your face and one camera facing down at the table at the same time.
So that’s a pretty cool feature also.
[56:32] All right, Chanel, we got a little more. Well, we do, but that’s a repeat. Do we have time?
Yeah, let’s try and get the answer real quick. Okay, Jewel, go ahead and unmute. Okay, so I’ll cover the other question that I had.
Is there an app, an OCR app, that allows you to edit the description that it creates or add an additional description within that app.
[56:57] Hmm. So, for example, if I have a photo of a cat, and the description in Seeing AI says it's a dog, can I change it to say it's a cat?
[57:09] Um, I'm not sure. I know that you can in one. Yeah.
[57:17] All right, it'd be nice.
[57:21] That might be something to suggest to Microsoft for their Seeing AI app. I'm going to do that. Thank you.
Yeah. Thank you, Jewel. And thank you, Michael, for coming. This has been a very interesting conversation and really makes me remember that there is that important camera in my pocket that I could be using.
And I’ll let Marty close it up for everyone and tell you how to get a replay of today’s call.
[57:49] All right, everyone. Thanks for coming. We are back next week with our regularly scheduled content asking any of your tech questions and trying to get those answered. If you have questions, comments,
suggestions or anything you can email us at unmute presents at gmail.com. And I’d like to thank Chanel and everyone else who helped us out today. And Michael, you got any last parting words for us?
Nope, just that. Oh, the other Michael, or were you asking me?
[58:24] No, I don't have any parting words. Thank you for joining us again, Michael. And if you guys missed anything or you want to go back and listen to the replay, it'll be on the Unmute Presents podcast feed within about an hour.
[58:36] Thanks, Michael Doise, for coming out. We really appreciate you today. Everyone else, have a great week, and we'll see you back here next week, same time, same place.
Support Unmute Presents by contributing to their tip jar: https://tips.pinecast.com/jar/unmute-presents-on-acb-communi
This podcast is powered by Pinecast. Try Pinecast for free, forever, no credit card required. If you decide to upgrade, use coupon code r-e4dc67 for 40% off for 4 months, and support Unmute Presents.